Mar 18 18:02:22 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 18 18:02:22 crc restorecon[4751]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 18:02:22 crc restorecon[4751]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc 
restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc 
restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 
18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc 
restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 18:02:22 crc 
restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:22
crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 18:02:22 crc restorecon[4751]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 18:02:22 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 
crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc 
restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc 
restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 18:02:23 crc restorecon[4751]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc 
restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 18:02:23 crc restorecon[4751]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 18:02:23 crc restorecon[4751]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 18 18:02:23 crc kubenswrapper[5008]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 18:02:23 crc kubenswrapper[5008]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 18 18:02:23 crc kubenswrapper[5008]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 18:02:23 crc kubenswrapper[5008]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 18 18:02:23 crc kubenswrapper[5008]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 18 18:02:23 crc kubenswrapper[5008]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.927545 5008 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936128 5008 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936161 5008 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936171 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936179 5008 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936189 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936198 5008 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936207 5008 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936218 5008 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936228 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936251 5008 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936260 5008 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936268 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936277 5008 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936285 5008 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936296 5008 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936306 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936316 5008 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936326 5008 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936334 5008 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936342 5008 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936349 5008 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936356 5008 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 
18:02:23.936365 5008 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936373 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936380 5008 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936387 5008 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936395 5008 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936403 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936411 5008 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936418 5008 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936426 5008 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936433 5008 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936444 5008 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936454 5008 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936463 5008 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936472 5008 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936480 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936488 5008 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936497 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936506 5008 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936516 5008 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936524 5008 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936532 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936539 5008 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936548 5008 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936587 5008 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936595 5008 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936604 5008 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936612 5008 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936620 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936628 5008 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936635 5008 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936643 5008 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936653 5008 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936662 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936669 5008 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936680 5008 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936689 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936700 5008 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936709 5008 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936718 5008 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936726 5008 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936734 5008 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936741 5008 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936749 5008 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936760 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936768 5008 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936775 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936783 5008 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936790 5008 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.936798 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.937854 5008 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.937878 5008 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.937895 5008 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.937906 5008 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.937918 5008 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.937927 5008 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.937938 5008 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.937949 5008 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.937958 5008 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.937968 5008 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.937977 5008 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.937986 5008 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.937995 5008 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938004 5008 flags.go:64] FLAG: --cgroup-root=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938012 5008 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938021 5008 flags.go:64] FLAG: --client-ca-file=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938029 5008 flags.go:64] FLAG: --cloud-config=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938038 5008 flags.go:64] FLAG: --cloud-provider=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938046 5008 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938057 5008 flags.go:64] FLAG: --cluster-domain=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938066 5008 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938075 5008 flags.go:64] FLAG: --config-dir=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938083 5008 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938109 5008 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938121 5008 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938130 5008 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938139 5008 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938148 5008 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938157 5008 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938166 5008 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938174 5008 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938186 5008 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938195 5008 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938206 5008 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938214 5008 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938223 5008 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938232 5008 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938240 5008 flags.go:64] FLAG: --enable-server="true"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938249 5008 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938261 5008 flags.go:64] FLAG: --event-burst="100"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938270 5008 flags.go:64] FLAG: --event-qps="50"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938279 5008 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938287 5008 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938296 5008 flags.go:64] FLAG: --eviction-hard=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938306 5008 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938315 5008 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938326 5008 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938334 5008 flags.go:64] FLAG: --eviction-soft=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938343 5008 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938352 5008 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938360 5008 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938369 5008 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938378 5008 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938387 5008 flags.go:64] FLAG: --fail-swap-on="true"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938396 5008 flags.go:64] FLAG: --feature-gates=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938406 5008 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938415 5008 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938425 5008 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938433 5008 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938444 5008 flags.go:64] FLAG: --healthz-port="10248"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938453 5008 flags.go:64] FLAG: --help="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938461 5008 flags.go:64] FLAG: --hostname-override=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938471 5008 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938480 5008 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938489 5008 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938497 5008 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938505 5008 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938516 5008 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938525 5008 flags.go:64] FLAG: --image-service-endpoint=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938533 5008 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938543 5008 flags.go:64] FLAG: --kube-api-burst="100"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938551 5008 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938588 5008 flags.go:64] FLAG: --kube-api-qps="50"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938597 5008 flags.go:64] FLAG: --kube-reserved=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938606 5008 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938614 5008 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938623 5008 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938632 5008 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938641 5008 flags.go:64] FLAG: --lock-file=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938649 5008 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938658 5008 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938667 5008 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938679 5008 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938688 5008 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938697 5008 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938705 5008 flags.go:64] FLAG: --logging-format="text"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938714 5008 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938723 5008 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938732 5008 flags.go:64] FLAG: --manifest-url=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938740 5008 flags.go:64] FLAG: --manifest-url-header=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938751 5008 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938760 5008 flags.go:64] FLAG: --max-open-files="1000000"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938770 5008 flags.go:64] FLAG: --max-pods="110"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938780 5008 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938788 5008 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938798 5008 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938807 5008 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938816 5008 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938825 5008 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938833 5008 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938853 5008 flags.go:64] FLAG: --node-status-max-images="50"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938862 5008 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938871 5008 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938880 5008 flags.go:64] FLAG: --pod-cidr=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938890 5008 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938902 5008 flags.go:64] FLAG: --pod-manifest-path=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938911 5008 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938920 5008 flags.go:64] FLAG: --pods-per-core="0"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938929 5008 flags.go:64] FLAG: --port="10250"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938938 5008 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938946 5008 flags.go:64] FLAG: --provider-id=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938955 5008 flags.go:64] FLAG: --qos-reserved=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938963 5008 flags.go:64] FLAG: --read-only-port="10255"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938973 5008 flags.go:64] FLAG: --register-node="true"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938981 5008 flags.go:64] FLAG: --register-schedulable="true"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.938990 5008 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939003 5008 flags.go:64] FLAG: --registry-burst="10"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939012 5008 flags.go:64] FLAG: --registry-qps="5"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939020 5008 flags.go:64] FLAG: --reserved-cpus=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939029 5008 flags.go:64] FLAG: --reserved-memory=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939039 5008 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939048 5008 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939057 5008 flags.go:64] FLAG: --rotate-certificates="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939066 5008 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939074 5008 flags.go:64] FLAG: --runonce="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939083 5008 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939092 5008 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939101 5008 flags.go:64] FLAG: --seccomp-default="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939110 5008 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939119 5008 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939128 5008 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939138 5008 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939147 5008 flags.go:64] FLAG: --storage-driver-password="root"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939156 5008 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939165 5008 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939173 5008 flags.go:64] FLAG: --storage-driver-user="root"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939182 5008 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939192 5008 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939200 5008 flags.go:64] FLAG: --system-cgroups=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939209 5008 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939222 5008 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939230 5008 flags.go:64] FLAG: --tls-cert-file=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939239 5008 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939249 5008 flags.go:64] FLAG: --tls-min-version=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939258 5008 flags.go:64] FLAG: --tls-private-key-file=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939266 5008 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939276 5008 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939285 5008 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939295 5008 flags.go:64] FLAG: --v="2"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939306 5008 flags.go:64] FLAG: --version="false"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939317 5008 flags.go:64] FLAG: --vmodule=""
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939327 5008 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.939336 5008 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939532 5008 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939542 5008 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939577 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939586 5008 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939594 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939601 5008 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939611 5008 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939619 5008 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939627 5008 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939634 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939642 5008 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939649 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939659 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939667 5008 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939676 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939683 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939694 5008 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939703 5008 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939712 5008 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939720 5008 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939728 5008 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939737 5008 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939745 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939754 5008 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939762 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939769 5008 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939777 5008 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939784 5008 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939792 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939799 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939807 5008 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939814 5008 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939821 5008 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939829 5008 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939837 5008 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939844 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939853 5008 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939860 5008 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939868 5008 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939876 5008 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939883 5008 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939891 5008 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939898 5008 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939905 5008 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939913 5008 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939921 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939928 5008 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939936 5008 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939946 5008 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939955 5008 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939963 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939971 5008 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939979 5008 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939986 5008 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.939994 5008 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940002 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940010 5008 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940017 5008 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940025 5008 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940035 5008 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940045 5008 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940053 5008 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940061 5008 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940071 5008 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940080 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940089 5008 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940096 5008 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940104 5008 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940113 5008 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940121 5008 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.940129 5008 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.940154 5008 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.953986 5008 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.954049 5008 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954619 5008 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954677 5008 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954687 5008 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954695 5008 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954704 5008 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954715 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954725 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954734 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954742 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954751 5008 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954759 5008 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954767 5008 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954782 5008 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954792 5008 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954810 5008 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954817 5008 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954823 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954828 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954835 5008 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954840 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954846 5008 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954853 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954859 5008 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954865 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954872 5008 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954882 5008 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954887 5008 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954896 5008 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954910 5008 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954918 5008 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954926 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954933 5008 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954942 5008 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954952 5008 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954959 5008 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954967 5008 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954977 5008 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.954998 5008 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955004 5008 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955011 5008 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955017 5008 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955023 5008 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955028 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955035 5008 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955041 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955049 5008 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955057 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955064 5008 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955071 5008 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955081 5008 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955087 5008 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955094 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955100 5008 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955106 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955112 5008 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955118 5008 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955124 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955133 5008 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955139 5008 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955147 5008 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955153 5008 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955158 5008 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955179 5008 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955187 5008 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955194 5008 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955200 5008 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955207 5008 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955213 5008 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955219 5008 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955225 5008 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955230 5008 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.955242 5008 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955935 5008 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955966 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955977 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955988 5008 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.955998 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956007 5008 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956016 5008 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956024 5008 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956032 5008 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956039 5008 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956047 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956055 5008 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956063 5008 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956071 5008 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956079 5008 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956087 5008 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956094 5008 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956102 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956110 5008 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956118 5008 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956126 5008 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956134 5008 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956142 5008 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956150 5008 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956158 5008 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956167 5008 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956175 5008 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956183 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956192 5008 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956200 5008 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956208 5008 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956215 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956223 5008 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956231 5008 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956241 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956250 5008 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956258 5008 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956266 5008 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956278 5008 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956290 5008 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956298 5008 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956306 5008 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956314 5008 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956322 5008 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956333 5008 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956341 5008 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956349 5008 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956358 5008 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956366 5008 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956374 5008 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956384 5008 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956392 5008 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956399 5008 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956407 5008 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956415 5008 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956423 5008 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956430 5008 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956438 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956446 5008 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956454 5008 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956462 5008 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956469 5008 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956478 5008 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956486 5008 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956495 5008 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956506 5008 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956515 5008 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956525 5008 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956534 5008 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956543 5008 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 18:02:23 crc kubenswrapper[5008]: W0318 18:02:23.956580 5008 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.956595 5008 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.957710 5008 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 18 18:02:23 crc kubenswrapper[5008]: E0318 18:02:23.962459 5008 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.969154 5008 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.969331 5008 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.971217 5008 server.go:997] "Starting client certificate rotation"
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.971269 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.971451 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 18 18:02:23 crc kubenswrapper[5008]: I0318 18:02:23.998321 5008 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.000069 5008 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.000732 5008 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.015736 5008 log.go:25] "Validated CRI v1 runtime API"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.055856 5008 log.go:25] "Validated CRI v1 image API"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.060118 5008 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.067262 5008 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-18-17-57-17-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.067329 5008 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.096409 5008 manager.go:217] Machine: {Timestamp:2026-03-18 18:02:24.092619328 +0000 UTC m=+0.612092487 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:85242208-ddaf-4ad1-b838-03a8e3bf165e BootID:8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:0e:55:bd Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:0e:55:bd Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:44:c6:42 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:63:4f:71 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c8:f0:93 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:16:fe:bf Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:9b:d7:4c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:06:84:44:df:40:f6 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:96:a1:a0:bb:c3:16 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.096877 5008 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.097184 5008 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.098631 5008 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.098885 5008 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.098939 5008 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.099204 5008 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.099219 5008 container_manager_linux.go:303] "Creating device plugin manager"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.099777 5008 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.099819 5008 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.100435 5008 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.101003 5008 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.104254 5008 kubelet.go:418] "Attempting to sync node with API server"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.104283 5008 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.104314 5008 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.104334 5008 kubelet.go:324] "Adding apiserver pod source"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.104349 5008 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.108430 5008 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.110107 5008 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 18 18:02:24 crc kubenswrapper[5008]: W0318 18:02:24.111260 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused
Mar 18 18:02:24 crc kubenswrapper[5008]: W0318 18:02:24.111289 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused
Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.111450 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError"
Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.111345 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError"
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.112267 5008 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 18 18:02:24 crc
kubenswrapper[5008]: I0318 18:02:24.114168 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.114217 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.114236 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.114251 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.114274 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.114290 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.114305 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.114327 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.114342 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.114358 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.114376 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.114392 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.115268 5008 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.115768 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.116157 5008 server.go:1280] "Started kubelet" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.117245 5008 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.117257 5008 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.118169 5008 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.118398 5008 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.118434 5008 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 18:02:24 crc systemd[1]: Started Kubernetes Kubelet. 
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.118714 5008 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.118741 5008 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.118822 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.120821 5008 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 18 18:02:24 crc kubenswrapper[5008]: W0318 18:02:24.121458 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.121794 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="200ms" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.122236 5008 factory.go:55] Registering systemd factory Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.122419 5008 factory.go:221] Registration of the systemd container factory successfully Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.123179 5008 factory.go:153] Registering CRI-O factory Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.123238 5008 factory.go:221] Registration of the crio container factory successfully Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.123386 5008 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix 
/run/containerd/containerd.sock: connect: no such file or directory Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.123429 5008 factory.go:103] Registering Raw factory Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.123460 5008 manager.go:1196] Started watching for new ooms in manager Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.124930 5008 server.go:460] "Adding debug handlers to kubelet server" Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.129277 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.125293 5008 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e017e0c668bc1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.116108225 +0000 UTC m=+0.635581364,LastTimestamp:2026-03-18 18:02:24.116108225 +0000 UTC m=+0.635581364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.132989 5008 manager.go:319] Starting recovery of all containers Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141264 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141351 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141379 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141400 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141423 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141445 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141463 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141522 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141547 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141595 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141614 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141677 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141696 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141719 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141747 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141771 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141797 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141824 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141843 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" 
seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141863 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141881 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141901 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141922 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141942 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141960 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.141979 5008 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142006 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142028 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142084 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142104 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142126 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142146 5008 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142166 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142188 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142216 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142235 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142322 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142344 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142367 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142388 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142410 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142429 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142449 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142471 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" 
seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142494 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142516 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142537 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142583 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142603 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.142624 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.144640 
5008 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.144798 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.144876 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.144967 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145061 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145130 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 
18:02:24.145193 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145255 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145317 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145393 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145461 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145572 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145636 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145694 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145761 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145818 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145879 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145940 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.145998 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146060 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146116 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146175 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146232 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146290 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146350 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" 
seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146422 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146496 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146572 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146632 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146693 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146759 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 
18:02:24.146818 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146876 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146934 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.146990 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147056 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147115 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147176 5008 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147246 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147305 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147378 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147457 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147541 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147634 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147696 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147758 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147823 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147884 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.147941 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.148024 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.148089 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.148233 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.148293 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.148362 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.148437 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.148512 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.148635 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.148701 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.148766 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.148825 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.148923 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.148981 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.149045 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.149109 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.149171 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.149229 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.149297 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.149363 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" 
seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.149444 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.149527 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.149703 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.149764 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.149832 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.149890 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 18 18:02:24 crc 
kubenswrapper[5008]: I0318 18:02:24.149955 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150012 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150075 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150141 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150202 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150267 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150326 5008 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150394 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150461 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150524 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150600 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150661 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150718 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150776 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150846 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150904 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.150965 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.151022 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.151081 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.151143 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.151206 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.151263 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.151319 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.151387 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.151451 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" 
Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.152486 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.152573 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.152639 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.152696 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.152755 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.152822 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.152880 5008 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.152938 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.153003 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.153062 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.153132 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.153197 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.153254 5008 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.153315 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.153384 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.153451 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.153537 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.153641 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.153699 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.153758 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.154120 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.154191 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.154252 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.154320 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.154388 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.154456 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.154527 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.154658 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.154721 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.154780 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.154836 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.154902 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.154965 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155024 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155079 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155134 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155189 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155254 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155316 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155391 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155455 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155519 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155617 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" 
seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155686 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155747 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155812 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155873 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155933 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.155990 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 18 18:02:24 crc 
kubenswrapper[5008]: I0318 18:02:24.156085 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.156153 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.156209 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.156265 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.156335 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.156409 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.156479 5008 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.156539 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.156637 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.156729 5008 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.156796 5008 reconstruct.go:97] "Volume reconstruction finished" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.156848 5008 reconciler.go:26] "Reconciler: start to sync state" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.164123 5008 manager.go:324] Recovery completed Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.177790 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.179902 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.180000 5008 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.180083 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.182308 5008 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.182387 5008 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.182452 5008 state_mem.go:36] "Initialized new in-memory state store" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.194210 5008 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.196897 5008 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.196969 5008 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.197014 5008 kubelet.go:2335] "Starting kubelet main sync loop" Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.197097 5008 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.197404 5008 policy_none.go:49] "None policy: Start" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.198901 5008 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.198988 5008 state_mem.go:35] "Initializing new in-memory state store" Mar 18 18:02:24 crc kubenswrapper[5008]: W0318 18:02:24.201110 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 
38.102.83.9:6443: connect: connection refused Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.201453 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.219943 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.271100 5008 manager.go:334] "Starting Device Plugin manager" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.277758 5008 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.277815 5008 server.go:79] "Starting device plugin registration server" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.278509 5008 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.278537 5008 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.279848 5008 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.279981 5008 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.279991 5008 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.290142 5008 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 
18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.297755 5008 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.297907 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.299416 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.299451 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.299464 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.299780 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.300074 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.300160 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.301033 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.301071 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.301082 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.301211 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.301391 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.301445 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.301624 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.301685 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.301695 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.302187 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.302229 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.302238 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.302473 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.302767 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.302854 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.303601 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.303638 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.303653 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.303954 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.304026 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.304060 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.304390 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.304463 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.304506 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.305251 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.305293 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.305319 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.305922 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.305973 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.305986 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.306322 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.306382 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.306722 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.306745 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.306755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.307895 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.307931 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.307946 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.323301 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="400ms" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.359391 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.359490 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.359582 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.359649 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.359703 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.359763 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 
18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.359810 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.359857 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.359902 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.359972 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.360041 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.360101 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.360147 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.360191 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.360286 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.380485 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.382628 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.382684 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 
18:02:24.382703 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.382744 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.383345 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462222 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462304 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462347 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462387 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462423 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462462 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462537 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462606 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462625 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462696 5008 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462747 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462649 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462647 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462632 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462816 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462823 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462870 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462894 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462906 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.462965 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.463020 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.463034 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.463156 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.463172 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.463215 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.463365 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 18:02:24 crc 
kubenswrapper[5008]: I0318 18:02:24.463350 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.463433 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.463448 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.463623 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.583996 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.585960 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.586028 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 
crc kubenswrapper[5008]: I0318 18:02:24.586048 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.586092 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.586961 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.642177 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.652049 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.671635 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.688459 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.698714 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 18:02:24 crc kubenswrapper[5008]: W0318 18:02:24.723701 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-d06d112ab695b9cab8b0b550989facfb5f24db1b8eca1fc5390c3c659fd92a82 WatchSource:0}: Error finding container d06d112ab695b9cab8b0b550989facfb5f24db1b8eca1fc5390c3c659fd92a82: Status 404 returned error can't find the container with id d06d112ab695b9cab8b0b550989facfb5f24db1b8eca1fc5390c3c659fd92a82 Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.725385 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="800ms" Mar 18 18:02:24 crc kubenswrapper[5008]: W0318 18:02:24.728575 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d9038e7afddcf82366ecf30f550061afb123d4f8a80d12b542019f7308c62ef2 WatchSource:0}: Error finding container d9038e7afddcf82366ecf30f550061afb123d4f8a80d12b542019f7308c62ef2: Status 404 returned error can't find the container with id d9038e7afddcf82366ecf30f550061afb123d4f8a80d12b542019f7308c62ef2 Mar 18 18:02:24 crc kubenswrapper[5008]: W0318 18:02:24.730140 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-138358c3e0ad0a6bef2b5430f5b40ab09200abcb4a875e4cb6f6bdde10c2f392 WatchSource:0}: Error finding container 138358c3e0ad0a6bef2b5430f5b40ab09200abcb4a875e4cb6f6bdde10c2f392: Status 404 returned error can't find the container with id 
138358c3e0ad0a6bef2b5430f5b40ab09200abcb4a875e4cb6f6bdde10c2f392 Mar 18 18:02:24 crc kubenswrapper[5008]: W0318 18:02:24.748452 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-4a108cfd65d8ef13af354c24fe55a3560f569c197db847fbb7f5365d09bd1150 WatchSource:0}: Error finding container 4a108cfd65d8ef13af354c24fe55a3560f569c197db847fbb7f5365d09bd1150: Status 404 returned error can't find the container with id 4a108cfd65d8ef13af354c24fe55a3560f569c197db847fbb7f5365d09bd1150 Mar 18 18:02:24 crc kubenswrapper[5008]: W0318 18:02:24.978674 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.978819 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.987324 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.990093 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.990160 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.990181 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
18 18:02:24 crc kubenswrapper[5008]: I0318 18:02:24.990236 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:02:24 crc kubenswrapper[5008]: E0318 18:02:24.991154 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Mar 18 18:02:25 crc kubenswrapper[5008]: I0318 18:02:25.117465 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 18 18:02:25 crc kubenswrapper[5008]: W0318 18:02:25.162375 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 18 18:02:25 crc kubenswrapper[5008]: E0318 18:02:25.162601 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 18:02:25 crc kubenswrapper[5008]: I0318 18:02:25.203716 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4a108cfd65d8ef13af354c24fe55a3560f569c197db847fbb7f5365d09bd1150"} Mar 18 18:02:25 crc kubenswrapper[5008]: I0318 18:02:25.206105 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6d5f1f152d578f40838a9a81589216f29fca56a1b14c1cbd615306ba7ff845cb"} Mar 18 18:02:25 crc kubenswrapper[5008]: I0318 18:02:25.208721 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"138358c3e0ad0a6bef2b5430f5b40ab09200abcb4a875e4cb6f6bdde10c2f392"} Mar 18 18:02:25 crc kubenswrapper[5008]: I0318 18:02:25.210605 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d9038e7afddcf82366ecf30f550061afb123d4f8a80d12b542019f7308c62ef2"} Mar 18 18:02:25 crc kubenswrapper[5008]: I0318 18:02:25.212167 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d06d112ab695b9cab8b0b550989facfb5f24db1b8eca1fc5390c3c659fd92a82"} Mar 18 18:02:25 crc kubenswrapper[5008]: W0318 18:02:25.388880 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 18 18:02:25 crc kubenswrapper[5008]: E0318 18:02:25.389662 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 18:02:25 crc kubenswrapper[5008]: E0318 18:02:25.526611 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="1.6s" Mar 18 18:02:25 crc kubenswrapper[5008]: W0318 18:02:25.583383 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 18 18:02:25 crc kubenswrapper[5008]: E0318 18:02:25.583521 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 18:02:25 crc kubenswrapper[5008]: I0318 18:02:25.791500 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:25 crc kubenswrapper[5008]: I0318 18:02:25.793705 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:25 crc kubenswrapper[5008]: I0318 18:02:25.793769 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:25 crc kubenswrapper[5008]: I0318 18:02:25.793788 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:25 crc kubenswrapper[5008]: I0318 18:02:25.793829 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:02:25 crc kubenswrapper[5008]: E0318 18:02:25.794605 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection 
refused" node="crc" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.116884 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.117840 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 18 18:02:26 crc kubenswrapper[5008]: E0318 18:02:26.118231 5008 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.221763 5008 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f" exitCode=0 Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.221941 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.221955 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f"} Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.223538 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.223623 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.223638 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.225346 5008 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946" exitCode=0 Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.225427 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946"} Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.225495 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.226755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.226792 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.226803 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.230061 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e2101ff8ac77ab3c2ad89a85919eadea6336e52c1fa8fa35b30d2310a185a85f"} Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.230118 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f5972f5ca38303ef2ebf0480fb68cbe693f99f58909bd703a7e9b35d6b6b4d8b"} Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.230142 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f11fed99b3e0c3592033b1e88ef8e6316eaeb569687a3141c3ed629fe1ba64ec"} Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.233293 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82" exitCode=0 Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.233439 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.233428 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82"} Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.234476 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.234511 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.234539 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.240193 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.241817 5008 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.241915 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.241935 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.242315 5008 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae" exitCode=0 Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.242372 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae"} Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.242535 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.246179 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.246216 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:26 crc kubenswrapper[5008]: I0318 18:02:26.246232 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.117014 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 18 18:02:27 crc kubenswrapper[5008]: E0318 
18:02:27.128074 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="3.2s" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.245270 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a8b929d609244b89fe0628e9f9fde457d15fe0745a6fb11039befdd9b87fc7a0"} Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.245315 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b3764491ec6fd4ded959c9f447badb396933fdb670769eece8e1371ce2df4288"} Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.245324 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"55b1d1dd9c1f850855b50655cd769c4380c67ac5fcd7203eefc24d35e53bcb33"} Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.245403 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.246444 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.246467 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.246475 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 
18:02:27.248070 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0e8d12a8adedc6347e344404eefcb3508701c3ae0c0c6a3c405a9b789262fd5c"} Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.248139 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.248738 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.248755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.248762 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.250387 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04"} Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.250406 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7"} Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.250415 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414"} Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.250423 5008 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5"} Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.251330 5008 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a" exitCode=0 Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.251362 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a"} Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.251431 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.251932 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.251958 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.251971 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.254692 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4"} Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.254764 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:27 crc 
kubenswrapper[5008]: I0318 18:02:27.255796 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.255835 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.255845 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.397357 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.399267 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.399303 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.399316 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:27 crc kubenswrapper[5008]: I0318 18:02:27.399341 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:02:27 crc kubenswrapper[5008]: E0318 18:02:27.399885 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Mar 18 18:02:27 crc kubenswrapper[5008]: W0318 18:02:27.445787 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Mar 18 18:02:27 crc kubenswrapper[5008]: E0318 18:02:27.445877 5008 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.260775 5008 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e" exitCode=0 Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.260898 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e"} Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.261055 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.262693 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.262747 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.262766 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.266615 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d10a7998b27bdc76be6dc4ea812d1215607dd26128a7ad088d21db1dbbd2a984"} Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.266776 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.266784 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.266895 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.266895 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.267108 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.268836 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.268895 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.268904 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.268922 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.268945 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.269013 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.268889 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.268946 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.269096 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.269121 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.269100 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.269197 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.616987 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:28 crc kubenswrapper[5008]: I0318 18:02:28.790988 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 18:02:29 crc kubenswrapper[5008]: I0318 18:02:29.276882 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:29 crc kubenswrapper[5008]: I0318 18:02:29.277346 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e"} Mar 18 18:02:29 crc kubenswrapper[5008]: I0318 18:02:29.277419 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6"} Mar 18 18:02:29 crc kubenswrapper[5008]: I0318 18:02:29.277449 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157"} Mar 18 18:02:29 crc kubenswrapper[5008]: I0318 18:02:29.277698 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 18:02:29 crc kubenswrapper[5008]: I0318 18:02:29.277796 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:29 crc kubenswrapper[5008]: I0318 18:02:29.278656 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:29 crc kubenswrapper[5008]: I0318 18:02:29.278723 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:29 crc kubenswrapper[5008]: I0318 18:02:29.278748 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:29 crc kubenswrapper[5008]: I0318 18:02:29.281726 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:29 crc kubenswrapper[5008]: I0318 18:02:29.281784 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:29 crc kubenswrapper[5008]: I0318 18:02:29.281807 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.223897 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.286272 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697"} Mar 18 18:02:30 crc 
kubenswrapper[5008]: I0318 18:02:30.286351 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9"} Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.286375 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.286400 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.286456 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.287936 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.287990 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.288010 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.288068 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.288135 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.288159 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.463275 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 
18:02:30.495731 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.601029 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.603192 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.603280 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.603300 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:30 crc kubenswrapper[5008]: I0318 18:02:30.603351 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:02:31 crc kubenswrapper[5008]: I0318 18:02:31.261365 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:31 crc kubenswrapper[5008]: I0318 18:02:31.261676 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:31 crc kubenswrapper[5008]: I0318 18:02:31.263512 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:31 crc kubenswrapper[5008]: I0318 18:02:31.263606 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:31 crc kubenswrapper[5008]: I0318 18:02:31.263620 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:31 crc kubenswrapper[5008]: I0318 18:02:31.289118 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:31 
crc kubenswrapper[5008]: I0318 18:02:31.290769 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:31 crc kubenswrapper[5008]: I0318 18:02:31.290829 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:31 crc kubenswrapper[5008]: I0318 18:02:31.290847 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.292267 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.294211 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.294277 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.294297 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.464234 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.464466 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.466019 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.466096 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.466116 5008 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.631872 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.632051 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.633367 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.633446 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.633467 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:32 crc kubenswrapper[5008]: I0318 18:02:32.970636 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:33 crc kubenswrapper[5008]: I0318 18:02:33.294608 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:33 crc kubenswrapper[5008]: I0318 18:02:33.296501 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:33 crc kubenswrapper[5008]: I0318 18:02:33.296579 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:33 crc kubenswrapper[5008]: I0318 18:02:33.296597 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:34 crc kubenswrapper[5008]: I0318 18:02:34.073878 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 
18:02:34 crc kubenswrapper[5008]: I0318 18:02:34.074139 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:34 crc kubenswrapper[5008]: I0318 18:02:34.076061 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:34 crc kubenswrapper[5008]: I0318 18:02:34.076132 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:34 crc kubenswrapper[5008]: I0318 18:02:34.076145 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:34 crc kubenswrapper[5008]: I0318 18:02:34.261843 5008 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 18:02:34 crc kubenswrapper[5008]: I0318 18:02:34.261975 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:02:34 crc kubenswrapper[5008]: E0318 18:02:34.290369 5008 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 18:02:35 crc kubenswrapper[5008]: I0318 18:02:35.648391 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:35 crc kubenswrapper[5008]: I0318 18:02:35.648678 5008 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 18 18:02:35 crc kubenswrapper[5008]: I0318 18:02:35.650751 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:35 crc kubenswrapper[5008]: I0318 18:02:35.650830 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:35 crc kubenswrapper[5008]: I0318 18:02:35.650851 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:35 crc kubenswrapper[5008]: I0318 18:02:35.655096 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:36 crc kubenswrapper[5008]: I0318 18:02:36.303035 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:36 crc kubenswrapper[5008]: I0318 18:02:36.304500 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:36 crc kubenswrapper[5008]: I0318 18:02:36.304606 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:36 crc kubenswrapper[5008]: I0318 18:02:36.304628 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:36 crc kubenswrapper[5008]: I0318 18:02:36.310457 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:37 crc kubenswrapper[5008]: I0318 18:02:37.305872 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:37 crc kubenswrapper[5008]: I0318 18:02:37.307446 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 18 18:02:37 crc kubenswrapper[5008]: I0318 18:02:37.307492 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:37 crc kubenswrapper[5008]: I0318 18:02:37.307506 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:37 crc kubenswrapper[5008]: W0318 18:02:37.648440 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 18:02:37 crc kubenswrapper[5008]: I0318 18:02:37.648613 5008 trace.go:236] Trace[1237384828]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 18:02:27.646) (total time: 10001ms): Mar 18 18:02:37 crc kubenswrapper[5008]: Trace[1237384828]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:02:37.648) Mar 18 18:02:37 crc kubenswrapper[5008]: Trace[1237384828]: [10.001758944s] [10.001758944s] END Mar 18 18:02:37 crc kubenswrapper[5008]: E0318 18:02:37.648651 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 18:02:37 crc kubenswrapper[5008]: W0318 18:02:37.796988 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 18:02:37 crc kubenswrapper[5008]: I0318 18:02:37.797134 5008 
trace.go:236] Trace[1292100395]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 18:02:27.795) (total time: 10001ms): Mar 18 18:02:37 crc kubenswrapper[5008]: Trace[1292100395]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:02:37.796) Mar 18 18:02:37 crc kubenswrapper[5008]: Trace[1292100395]: [10.001833586s] [10.001833586s] END Mar 18 18:02:37 crc kubenswrapper[5008]: E0318 18:02:37.797169 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 18:02:37 crc kubenswrapper[5008]: W0318 18:02:37.992840 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 18:02:37 crc kubenswrapper[5008]: I0318 18:02:37.992996 5008 trace.go:236] Trace[1312488793]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 18:02:27.990) (total time: 10002ms): Mar 18 18:02:37 crc kubenswrapper[5008]: Trace[1312488793]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (18:02:37.992) Mar 18 18:02:37 crc kubenswrapper[5008]: Trace[1312488793]: [10.002591228s] [10.002591228s] END Mar 18 18:02:37 crc kubenswrapper[5008]: E0318 18:02:37.993034 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 18:02:38 crc kubenswrapper[5008]: I0318 18:02:38.118354 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 18 18:02:38 crc kubenswrapper[5008]: E0318 18:02:38.173766 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:38Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 18:02:38 crc kubenswrapper[5008]: E0318 18:02:38.175394 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:38Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 18 18:02:38 crc kubenswrapper[5008]: W0318 18:02:38.177782 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:38Z is after 2026-02-23T05:33:13Z Mar 18 18:02:38 crc kubenswrapper[5008]: E0318 18:02:38.177962 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T18:02:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 18:02:38 crc kubenswrapper[5008]: E0318 18:02:38.182901 5008 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:38Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e017e0c668bc1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.116108225 +0000 UTC m=+0.635581364,LastTimestamp:2026-03-18 18:02:24.116108225 +0000 UTC m=+0.635581364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:38 crc kubenswrapper[5008]: E0318 18:02:38.186178 5008 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 18:02:38 crc kubenswrapper[5008]: I0318 18:02:38.194670 5008 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 18:02:38 crc kubenswrapper[5008]: I0318 18:02:38.194734 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 18:02:38 crc kubenswrapper[5008]: I0318 18:02:38.208089 5008 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 18:02:38 crc kubenswrapper[5008]: I0318 18:02:38.208181 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 18:02:38 crc kubenswrapper[5008]: I0318 18:02:38.630772 5008 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]log ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]etcd ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 18 18:02:38 crc 
kubenswrapper[5008]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/generic-apiserver-start-informers ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/priority-and-fairness-filter ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/start-apiextensions-informers ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/start-apiextensions-controllers ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/crd-informer-synced ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/start-system-namespaces-controller ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 18 18:02:38 crc kubenswrapper[5008]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 18 18:02:38 crc kubenswrapper[5008]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/bootstrap-controller ok Mar 18 18:02:38 crc kubenswrapper[5008]: 
[+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/start-kube-aggregator-informers ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/apiservice-registration-controller ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/apiservice-discovery-controller ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]autoregister-completion ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/apiservice-openapi-controller ok Mar 18 18:02:38 crc kubenswrapper[5008]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 18 18:02:38 crc kubenswrapper[5008]: livez check failed Mar 18 18:02:38 crc kubenswrapper[5008]: I0318 18:02:38.630960 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 18:02:39 crc kubenswrapper[5008]: I0318 18:02:39.121443 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:39Z is after 2026-02-23T05:33:13Z Mar 18 18:02:39 crc kubenswrapper[5008]: I0318 18:02:39.312152 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 18:02:39 crc kubenswrapper[5008]: I0318 18:02:39.314342 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d10a7998b27bdc76be6dc4ea812d1215607dd26128a7ad088d21db1dbbd2a984" exitCode=255 Mar 18 18:02:39 crc kubenswrapper[5008]: I0318 18:02:39.314400 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d10a7998b27bdc76be6dc4ea812d1215607dd26128a7ad088d21db1dbbd2a984"} Mar 18 18:02:39 crc kubenswrapper[5008]: I0318 18:02:39.314590 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:39 crc kubenswrapper[5008]: I0318 18:02:39.315530 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:39 crc kubenswrapper[5008]: I0318 18:02:39.315587 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:39 crc kubenswrapper[5008]: I0318 18:02:39.315604 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:39 crc kubenswrapper[5008]: I0318 18:02:39.316172 5008 scope.go:117] "RemoveContainer" containerID="d10a7998b27bdc76be6dc4ea812d1215607dd26128a7ad088d21db1dbbd2a984" Mar 18 18:02:40 crc kubenswrapper[5008]: I0318 18:02:40.120648 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:40Z is after 2026-02-23T05:33:13Z Mar 18 18:02:40 crc kubenswrapper[5008]: 
I0318 18:02:40.320984 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 18:02:40 crc kubenswrapper[5008]: I0318 18:02:40.324209 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c64a3dfda1f490a1cf022dd55f4f56b6bfc5de254a363bbd6238cf97679fc82e"} Mar 18 18:02:40 crc kubenswrapper[5008]: I0318 18:02:40.324406 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:40 crc kubenswrapper[5008]: I0318 18:02:40.325694 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:40 crc kubenswrapper[5008]: I0318 18:02:40.325746 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:40 crc kubenswrapper[5008]: I0318 18:02:40.325763 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:40 crc kubenswrapper[5008]: I0318 18:02:40.503724 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 18 18:02:40 crc kubenswrapper[5008]: I0318 18:02:40.503961 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:40 crc kubenswrapper[5008]: I0318 18:02:40.505833 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:40 crc kubenswrapper[5008]: I0318 18:02:40.505903 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:40 crc kubenswrapper[5008]: I0318 18:02:40.505921 5008 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:40 crc kubenswrapper[5008]: I0318 18:02:40.527945 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.122603 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:41Z is after 2026-02-23T05:33:13Z Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.329311 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.330243 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.333413 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c64a3dfda1f490a1cf022dd55f4f56b6bfc5de254a363bbd6238cf97679fc82e" exitCode=255 Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.333519 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c64a3dfda1f490a1cf022dd55f4f56b6bfc5de254a363bbd6238cf97679fc82e"} Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.333655 5008 scope.go:117] "RemoveContainer" containerID="d10a7998b27bdc76be6dc4ea812d1215607dd26128a7ad088d21db1dbbd2a984" Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.333685 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.333927 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.335629 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.335755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.335846 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.335775 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.335942 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.335963 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:41 crc kubenswrapper[5008]: I0318 18:02:41.337151 5008 scope.go:117] "RemoveContainer" containerID="c64a3dfda1f490a1cf022dd55f4f56b6bfc5de254a363bbd6238cf97679fc82e" Mar 18 18:02:41 crc kubenswrapper[5008]: E0318 18:02:41.337382 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:02:41 crc kubenswrapper[5008]: W0318 18:02:41.956262 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to 
list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:41Z is after 2026-02-23T05:33:13Z Mar 18 18:02:41 crc kubenswrapper[5008]: E0318 18:02:41.956430 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 18:02:42 crc kubenswrapper[5008]: I0318 18:02:42.121881 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:42Z is after 2026-02-23T05:33:13Z Mar 18 18:02:42 crc kubenswrapper[5008]: I0318 18:02:42.341704 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 18:02:42 crc kubenswrapper[5008]: I0318 18:02:42.464494 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:42 crc kubenswrapper[5008]: I0318 18:02:42.464731 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:42 crc kubenswrapper[5008]: I0318 18:02:42.465894 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:42 crc kubenswrapper[5008]: I0318 18:02:42.465934 5008 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:42 crc kubenswrapper[5008]: I0318 18:02:42.465950 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:42 crc kubenswrapper[5008]: I0318 18:02:42.466726 5008 scope.go:117] "RemoveContainer" containerID="c64a3dfda1f490a1cf022dd55f4f56b6bfc5de254a363bbd6238cf97679fc82e" Mar 18 18:02:42 crc kubenswrapper[5008]: E0318 18:02:42.466987 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:02:43 crc kubenswrapper[5008]: I0318 18:02:43.122324 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:43Z is after 2026-02-23T05:33:13Z Mar 18 18:02:43 crc kubenswrapper[5008]: W0318 18:02:43.246619 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:43Z is after 2026-02-23T05:33:13Z Mar 18 18:02:43 crc kubenswrapper[5008]: E0318 18:02:43.246732 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 18:02:43 crc kubenswrapper[5008]: W0318 18:02:43.392660 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:43Z is after 2026-02-23T05:33:13Z Mar 18 18:02:43 crc kubenswrapper[5008]: E0318 18:02:43.392770 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 18:02:43 crc kubenswrapper[5008]: I0318 18:02:43.628857 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:43 crc kubenswrapper[5008]: I0318 18:02:43.629058 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:43 crc kubenswrapper[5008]: I0318 18:02:43.630436 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:43 crc kubenswrapper[5008]: I0318 18:02:43.630491 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:43 crc kubenswrapper[5008]: I0318 18:02:43.630503 5008 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:43 crc kubenswrapper[5008]: I0318 18:02:43.631645 5008 scope.go:117] "RemoveContainer" containerID="c64a3dfda1f490a1cf022dd55f4f56b6bfc5de254a363bbd6238cf97679fc82e" Mar 18 18:02:43 crc kubenswrapper[5008]: E0318 18:02:43.631854 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:02:43 crc kubenswrapper[5008]: I0318 18:02:43.633596 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:44 crc kubenswrapper[5008]: I0318 18:02:44.119646 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:44Z is after 2026-02-23T05:33:13Z Mar 18 18:02:44 crc kubenswrapper[5008]: I0318 18:02:44.262584 5008 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 18:02:44 crc kubenswrapper[5008]: I0318 18:02:44.262726 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 18:02:44 crc kubenswrapper[5008]: E0318 18:02:44.290901 5008 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 18:02:44 crc kubenswrapper[5008]: I0318 18:02:44.351268 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:44 crc kubenswrapper[5008]: I0318 18:02:44.352755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:44 crc kubenswrapper[5008]: I0318 18:02:44.352816 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:44 crc kubenswrapper[5008]: I0318 18:02:44.352831 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:44 crc kubenswrapper[5008]: I0318 18:02:44.353688 5008 scope.go:117] "RemoveContainer" containerID="c64a3dfda1f490a1cf022dd55f4f56b6bfc5de254a363bbd6238cf97679fc82e" Mar 18 18:02:44 crc kubenswrapper[5008]: E0318 18:02:44.353934 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:02:44 crc kubenswrapper[5008]: I0318 18:02:44.574648 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:44 crc kubenswrapper[5008]: I0318 18:02:44.576758 5008 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:44 crc kubenswrapper[5008]: I0318 18:02:44.576819 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:44 crc kubenswrapper[5008]: I0318 18:02:44.576840 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:44 crc kubenswrapper[5008]: I0318 18:02:44.576879 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:02:44 crc kubenswrapper[5008]: E0318 18:02:44.581462 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:44Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 18:02:44 crc kubenswrapper[5008]: E0318 18:02:44.584765 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:44Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 18:02:45 crc kubenswrapper[5008]: I0318 18:02:45.098897 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:02:45 crc kubenswrapper[5008]: I0318 18:02:45.122441 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:02:45Z is after 2026-02-23T05:33:13Z Mar 18 18:02:45 crc kubenswrapper[5008]: I0318 
18:02:45.354283 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:45 crc kubenswrapper[5008]: I0318 18:02:45.355867 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:45 crc kubenswrapper[5008]: I0318 18:02:45.355932 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:45 crc kubenswrapper[5008]: I0318 18:02:45.355958 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:45 crc kubenswrapper[5008]: I0318 18:02:45.356996 5008 scope.go:117] "RemoveContainer" containerID="c64a3dfda1f490a1cf022dd55f4f56b6bfc5de254a363bbd6238cf97679fc82e" Mar 18 18:02:45 crc kubenswrapper[5008]: E0318 18:02:45.357417 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:02:46 crc kubenswrapper[5008]: I0318 18:02:46.124379 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:46 crc kubenswrapper[5008]: I0318 18:02:46.771812 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 18:02:46 crc kubenswrapper[5008]: I0318 18:02:46.799135 5008 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 18:02:47 crc kubenswrapper[5008]: I0318 
18:02:47.122840 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:48 crc kubenswrapper[5008]: I0318 18:02:48.124170 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.190822 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e0c668bc1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.116108225 +0000 UTC m=+0.635581364,LastTimestamp:2026-03-18 18:02:24.116108225 +0000 UTC m=+0.635581364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.198284 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10353617 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.179983895 +0000 UTC m=+0.699456984,LastTimestamp:2026-03-18 18:02:24.179983895 +0000 UTC m=+0.699456984,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.205468 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e1036a3c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180077506 +0000 UTC m=+0.699550605,LastTimestamp:2026-03-18 18:02:24.180077506 +0000 UTC m=+0.699550605,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.213128 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10378315 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180134677 +0000 UTC m=+0.699607766,LastTimestamp:2026-03-18 18:02:24.180134677 +0000 UTC 
m=+0.699607766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.219949 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e162a19be default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.279919038 +0000 UTC m=+0.799392117,LastTimestamp:2026-03-18 18:02:24.279919038 +0000 UTC m=+0.799392117,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.227798 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e10353617\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10353617 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.179983895 +0000 UTC m=+0.699456984,LastTimestamp:2026-03-18 18:02:24.299441335 +0000 UTC m=+0.818914414,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc 
kubenswrapper[5008]: E0318 18:02:48.235530 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e1036a3c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e1036a3c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180077506 +0000 UTC m=+0.699550605,LastTimestamp:2026-03-18 18:02:24.299460155 +0000 UTC m=+0.818933224,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.242444 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e10378315\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10378315 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180134677 +0000 UTC m=+0.699607766,LastTimestamp:2026-03-18 18:02:24.299471376 +0000 UTC m=+0.818944455,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.248980 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e10353617\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10353617 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.179983895 +0000 UTC m=+0.699456984,LastTimestamp:2026-03-18 18:02:24.301053624 +0000 UTC m=+0.820526703,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.255625 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e1036a3c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e1036a3c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180077506 +0000 UTC m=+0.699550605,LastTimestamp:2026-03-18 18:02:24.301078785 +0000 UTC m=+0.820551864,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.263014 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e10378315\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10378315 default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180134677 +0000 UTC m=+0.699607766,LastTimestamp:2026-03-18 18:02:24.301089225 +0000 UTC m=+0.820562304,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.270163 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e10353617\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10353617 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.179983895 +0000 UTC m=+0.699456984,LastTimestamp:2026-03-18 18:02:24.301663933 +0000 UTC m=+0.821137012,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.277743 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e1036a3c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e1036a3c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180077506 +0000 UTC m=+0.699550605,LastTimestamp:2026-03-18 18:02:24.301691613 +0000 UTC m=+0.821164692,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.286760 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e10378315\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10378315 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180134677 +0000 UTC m=+0.699607766,LastTimestamp:2026-03-18 18:02:24.301701054 +0000 UTC m=+0.821174133,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.295387 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e10353617\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10353617 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.179983895 +0000 UTC m=+0.699456984,LastTimestamp:2026-03-18 18:02:24.30222064 +0000 UTC m=+0.821693709,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.301305 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e1036a3c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e1036a3c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180077506 +0000 UTC m=+0.699550605,LastTimestamp:2026-03-18 18:02:24.30223475 +0000 UTC m=+0.821707829,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.309697 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e10378315\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10378315 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180134677 +0000 UTC 
m=+0.699607766,LastTimestamp:2026-03-18 18:02:24.30224378 +0000 UTC m=+0.821716859,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.316929 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e10353617\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10353617 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.179983895 +0000 UTC m=+0.699456984,LastTimestamp:2026-03-18 18:02:24.303629823 +0000 UTC m=+0.823102912,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.323974 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e1036a3c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e1036a3c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180077506 +0000 UTC m=+0.699550605,LastTimestamp:2026-03-18 18:02:24.303647343 +0000 UTC m=+0.823120442,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.331939 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e10378315\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10378315 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180134677 +0000 UTC m=+0.699607766,LastTimestamp:2026-03-18 18:02:24.303660964 +0000 UTC m=+0.823134063,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.338320 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e10353617\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10353617 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.179983895 +0000 UTC m=+0.699456984,LastTimestamp:2026-03-18 18:02:24.303998884 +0000 UTC m=+0.823472003,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.343881 5008 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e1036a3c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e1036a3c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180077506 +0000 UTC m=+0.699550605,LastTimestamp:2026-03-18 18:02:24.304044235 +0000 UTC m=+0.823517344,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.350231 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e10378315\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10378315 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180134677 +0000 UTC m=+0.699607766,LastTimestamp:2026-03-18 18:02:24.304075186 +0000 UTC m=+0.823548325,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.356950 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e10353617\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e10353617 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.179983895 +0000 UTC m=+0.699456984,LastTimestamp:2026-03-18 18:02:24.305276503 +0000 UTC m=+0.824749592,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.363603 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e017e1036a3c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e017e1036a3c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.180077506 +0000 UTC m=+0.699550605,LastTimestamp:2026-03-18 18:02:24.305312744 +0000 UTC m=+0.824785843,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.371920 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017e31844a6b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.738814571 +0000 UTC m=+1.258287690,LastTimestamp:2026-03-18 18:02:24.738814571 +0000 UTC m=+1.258287690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.377278 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017e3184f282 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.738857602 +0000 UTC m=+1.258330721,LastTimestamp:2026-03-18 18:02:24.738857602 +0000 UTC m=+1.258330721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.382303 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e017e318593e2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.738898914 +0000 UTC m=+1.258372033,LastTimestamp:2026-03-18 18:02:24.738898914 +0000 UTC m=+1.258372033,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.388848 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e31d62d49 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.744181065 +0000 
UTC m=+1.263654184,LastTimestamp:2026-03-18 18:02:24.744181065 +0000 UTC m=+1.263654184,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.395212 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e017e325ecc43 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:24.753134659 +0000 UTC m=+1.272607778,LastTimestamp:2026-03-18 18:02:24.753134659 +0000 UTC m=+1.272607778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.401115 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e5c80473b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.459971899 +0000 UTC m=+1.979445018,LastTimestamp:2026-03-18 18:02:25.459971899 +0000 UTC m=+1.979445018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.403096 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e017e5c8ec79c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.460922268 +0000 UTC m=+1.980395387,LastTimestamp:2026-03-18 18:02:25.460922268 +0000 UTC m=+1.980395387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.407125 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e017e5c995af5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.461615349 +0000 UTC m=+1.981088468,LastTimestamp:2026-03-18 18:02:25.461615349 +0000 UTC m=+1.981088468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.412495 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017e5ca26841 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.462208577 +0000 UTC m=+1.981681696,LastTimestamp:2026-03-18 18:02:25.462208577 +0000 UTC m=+1.981681696,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.416607 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017e5ccae16e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.464861038 +0000 UTC m=+1.984334157,LastTimestamp:2026-03-18 18:02:25.464861038 +0000 UTC m=+1.984334157,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.421498 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e017e5d603a5b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.474648667 +0000 UTC m=+1.994121786,LastTimestamp:2026-03-18 18:02:25.474648667 +0000 UTC m=+1.994121786,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.425650 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e017e5daeb048 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.479790664 +0000 UTC m=+1.999263773,LastTimestamp:2026-03-18 18:02:25.479790664 +0000 UTC m=+1.999263773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.430295 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e5dc16f9c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.481019292 +0000 UTC m=+2.000492411,LastTimestamp:2026-03-18 18:02:25.481019292 +0000 UTC m=+2.000492411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.434631 5008 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017e5dd49a70 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.48227544 +0000 UTC m=+2.001748549,LastTimestamp:2026-03-18 18:02:25.48227544 +0000 UTC m=+2.001748549,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.440527 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e5de4479c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.483302812 +0000 UTC m=+2.002775901,LastTimestamp:2026-03-18 18:02:25.483302812 +0000 UTC m=+2.002775901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.445420 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017e5e6c725e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.492226654 +0000 UTC m=+2.011699773,LastTimestamp:2026-03-18 18:02:25.492226654 +0000 UTC m=+2.011699773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.449536 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e7306bfe7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.837883367 +0000 UTC m=+2.357356486,LastTimestamp:2026-03-18 18:02:25.837883367 +0000 UTC 
m=+2.357356486,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.453269 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e73ae9a15 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.848883733 +0000 UTC m=+2.368356852,LastTimestamp:2026-03-18 18:02:25.848883733 +0000 UTC m=+2.368356852,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.457971 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e73c97bb3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.850645427 +0000 UTC m=+2.370118536,LastTimestamp:2026-03-18 18:02:25.850645427 +0000 UTC m=+2.370118536,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.462765 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e835fc151 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.112151889 +0000 UTC m=+2.631624998,LastTimestamp:2026-03-18 18:02:26.112151889 +0000 UTC m=+2.631624998,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.466548 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e844d796f openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.127731055 +0000 UTC m=+2.647204174,LastTimestamp:2026-03-18 18:02:26.127731055 +0000 UTC m=+2.647204174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.472917 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e84655a8d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.129296013 +0000 UTC m=+2.648769122,LastTimestamp:2026-03-18 18:02:26.129296013 +0000 UTC m=+2.648769122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 
18:02:48.480839 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e017e8a1fa018 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.225389592 +0000 UTC m=+2.744862671,LastTimestamp:2026-03-18 18:02:26.225389592 +0000 UTC m=+2.744862671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.486391 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e017e8a4b2dbf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.228243903 +0000 UTC m=+2.747716992,LastTimestamp:2026-03-18 18:02:26.228243903 +0000 UTC m=+2.747716992,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.491670 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017e8afea986 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.240006534 +0000 UTC m=+2.759479613,LastTimestamp:2026-03-18 18:02:26.240006534 +0000 UTC m=+2.759479613,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.495758 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017e8b70e9bb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.247494075 +0000 UTC m=+2.766967154,LastTimestamp:2026-03-18 18:02:26.247494075 +0000 UTC m=+2.766967154,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.500704 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e9347f066 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.379026534 +0000 UTC m=+2.898499613,LastTimestamp:2026-03-18 18:02:26.379026534 +0000 UTC m=+2.898499613,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.505124 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e94f489a8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.407115176 +0000 UTC m=+2.926588255,LastTimestamp:2026-03-18 18:02:26.407115176 +0000 UTC m=+2.926588255,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.507007 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e017e967c7de9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.432802281 +0000 UTC m=+2.952275360,LastTimestamp:2026-03-18 18:02:26.432802281 +0000 UTC m=+2.952275360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.509356 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017e977f1961 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.449750369 +0000 UTC m=+2.969223448,LastTimestamp:2026-03-18 18:02:26.449750369 +0000 UTC m=+2.969223448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.513045 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e017e979cff6c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.451709804 +0000 UTC m=+2.971182883,LastTimestamp:2026-03-18 18:02:26.451709804 +0000 UTC m=+2.971182883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.518141 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e017e97a3dc77 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.452159607 +0000 UTC m=+2.971632686,LastTimestamp:2026-03-18 18:02:26.452159607 +0000 UTC m=+2.971632686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.523758 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017e97ae35ea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.452837866 +0000 UTC m=+2.972310955,LastTimestamp:2026-03-18 18:02:26.452837866 +0000 UTC 
m=+2.972310955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.530147 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e017e97af241f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.452898847 +0000 UTC m=+2.972371926,LastTimestamp:2026-03-18 18:02:26.452898847 +0000 UTC m=+2.972371926,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.534523 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017e986f5db7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started 
container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.465496503 +0000 UTC m=+2.984969582,LastTimestamp:2026-03-18 18:02:26.465496503 +0000 UTC m=+2.984969582,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.540864 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017e987f3a60 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.466536032 +0000 UTC m=+2.986009101,LastTimestamp:2026-03-18 18:02:26.466536032 +0000 UTC m=+2.986009101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.546344 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017e98e645d4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.473289172 +0000 UTC m=+2.992762251,LastTimestamp:2026-03-18 18:02:26.473289172 +0000 UTC m=+2.992762251,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.550325 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e017e990c6981 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.475788673 +0000 UTC m=+2.995261752,LastTimestamp:2026-03-18 18:02:26.475788673 +0000 UTC m=+2.995261752,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.554972 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e017ea3cf3444 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.656326724 +0000 UTC m=+3.175799803,LastTimestamp:2026-03-18 18:02:26.656326724 +0000 UTC m=+3.175799803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.558948 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017ea49df86c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.669877356 +0000 UTC m=+3.189350435,LastTimestamp:2026-03-18 18:02:26.669877356 +0000 UTC m=+3.189350435,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.563413 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e017ea4c9c849 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.672748617 +0000 UTC m=+3.192221686,LastTimestamp:2026-03-18 18:02:26.672748617 +0000 UTC m=+3.192221686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.569641 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e017ea4e3b851 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.674448465 +0000 UTC m=+3.193921544,LastTimestamp:2026-03-18 18:02:26.674448465 +0000 UTC m=+3.193921544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.574229 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017ea5e175e4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.691077604 +0000 UTC m=+3.210550703,LastTimestamp:2026-03-18 18:02:26.691077604 +0000 UTC m=+3.210550703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.579115 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017ea602fa68 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.693274216 +0000 UTC m=+3.212747305,LastTimestamp:2026-03-18 18:02:26.693274216 +0000 UTC m=+3.212747305,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.582874 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017eb17c8b96 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.885790614 +0000 UTC m=+3.405263693,LastTimestamp:2026-03-18 18:02:26.885790614 +0000 UTC m=+3.405263693,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.586736 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e017eb1acc590 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.888951184 +0000 UTC m=+3.408424263,LastTimestamp:2026-03-18 18:02:26.888951184 +0000 UTC m=+3.408424263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.590619 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017eb2455059 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.898948185 +0000 UTC m=+3.418421264,LastTimestamp:2026-03-18 18:02:26.898948185 +0000 UTC m=+3.418421264,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.595204 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e017eb2574e61 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.900127329 +0000 UTC m=+3.419600408,LastTimestamp:2026-03-18 18:02:26.900127329 +0000 UTC m=+3.419600408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.600011 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e017eb27f6157 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:26.902753623 +0000 UTC m=+3.422226702,LastTimestamp:2026-03-18 18:02:26.902753623 +0000 UTC m=+3.422226702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc 
kubenswrapper[5008]: E0318 18:02:48.604237 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017ebdfe73aa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:27.095630762 +0000 UTC m=+3.615103851,LastTimestamp:2026-03-18 18:02:27.095630762 +0000 UTC m=+3.615103851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.608860 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017ebee92d9a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:27.111013786 +0000 UTC m=+3.630486875,LastTimestamp:2026-03-18 18:02:27.111013786 +0000 UTC m=+3.630486875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.613660 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017ebefe0b50 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:27.112381264 +0000 UTC m=+3.631854353,LastTimestamp:2026-03-18 18:02:27.112381264 +0000 UTC m=+3.631854353,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.620957 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017ec7699c36 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:27.253648438 +0000 UTC m=+3.773121517,LastTimestamp:2026-03-18 18:02:27.253648438 +0000 UTC m=+3.773121517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.625738 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017eca273c7c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:27.299630204 +0000 UTC m=+3.819103283,LastTimestamp:2026-03-18 18:02:27.299630204 +0000 UTC m=+3.819103283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.629577 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017ecad6f405 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:27.311145989 +0000 UTC m=+3.830619078,LastTimestamp:2026-03-18 18:02:27.311145989 +0000 UTC m=+3.830619078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.634895 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017ed57492d0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:27.489247952 +0000 UTC m=+4.008721031,LastTimestamp:2026-03-18 18:02:27.489247952 +0000 UTC m=+4.008721031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.641881 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017ed6cf3ed2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:27.511967442 +0000 UTC m=+4.031440551,LastTimestamp:2026-03-18 18:02:27.511967442 +0000 UTC m=+4.031440551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.650203 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017f03b7312e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:28.265365806 +0000 UTC m=+4.784838925,LastTimestamp:2026-03-18 18:02:28.265365806 +0000 UTC m=+4.784838925,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.657155 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e017f139a5c35 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:28.531911733 +0000 UTC m=+5.051384852,LastTimestamp:2026-03-18 18:02:28.531911733 +0000 UTC m=+5.051384852,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.661745 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017f14834675 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:28.547176053 +0000 UTC m=+5.066649172,LastTimestamp:2026-03-18 18:02:28.547176053 +0000 UTC m=+5.066649172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.666255 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017f14a05bc9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:28.549082057 +0000 UTC m=+5.068555176,LastTimestamp:2026-03-18 18:02:28.549082057 +0000 UTC m=+5.068555176,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.670663 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017f24de7727 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:28.821587751 +0000 UTC m=+5.341060840,LastTimestamp:2026-03-18 18:02:28.821587751 +0000 UTC m=+5.341060840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.675675 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017f2605f45f openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:28.840952927 +0000 UTC m=+5.360426016,LastTimestamp:2026-03-18 18:02:28.840952927 +0000 UTC m=+5.360426016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.680191 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017f261fccd9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:28.842646745 +0000 UTC m=+5.362119824,LastTimestamp:2026-03-18 18:02:28.842646745 +0000 UTC m=+5.362119824,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.685038 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e017f342f2777 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:29.078534007 +0000 UTC m=+5.598007086,LastTimestamp:2026-03-18 18:02:29.078534007 +0000 UTC m=+5.598007086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.690001 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017f350666f9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:29.092640505 +0000 UTC m=+5.612113584,LastTimestamp:2026-03-18 18:02:29.092640505 +0000 UTC m=+5.612113584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.695550 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017f351c9b28 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:29.094095656 +0000 UTC m=+5.613568725,LastTimestamp:2026-03-18 18:02:29.094095656 +0000 UTC m=+5.613568725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.700184 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017f4476d36b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:29.351666539 +0000 UTC m=+5.871139658,LastTimestamp:2026-03-18 18:02:29.351666539 +0000 UTC m=+5.871139658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.703581 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e017f4582b64e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:29.369222734 +0000 UTC m=+5.888695853,LastTimestamp:2026-03-18 18:02:29.369222734 +0000 UTC m=+5.888695853,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.707326 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017f459d5363 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:29.370966883 +0000 UTC m=+5.890440002,LastTimestamp:2026-03-18 18:02:29.370966883 +0000 UTC m=+5.890440002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.710607 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017f52c14f1a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:29.59142889 +0000 UTC m=+6.110901979,LastTimestamp:2026-03-18 18:02:29.59142889 +0000 UTC m=+6.110901979,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.711941 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e017f53cabb3a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:29.60882361 +0000 UTC m=+6.128296699,LastTimestamp:2026-03-18 18:02:29.60882361 +0000 UTC m=+6.128296699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.714729 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 18:02:48 crc 
kubenswrapper[5008]: &Event{ObjectMeta:{kube-controller-manager-crc.189e0180692398f3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 18:02:48 crc kubenswrapper[5008]: body: Mar 18 18:02:48 crc kubenswrapper[5008]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:34.261936371 +0000 UTC m=+10.781409490,LastTimestamp:2026-03-18 18:02:34.261936371 +0000 UTC m=+10.781409490,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 18:02:48 crc kubenswrapper[5008]: > Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.717739 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e0180692516e0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:34.262034144 +0000 UTC 
m=+10.781507263,LastTimestamp:2026-03-18 18:02:34.262034144 +0000 UTC m=+10.781507263,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.723491 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 18:02:48 crc kubenswrapper[5008]: &Event{ObjectMeta:{kube-apiserver-crc.189e0181538d0a36 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 18:02:48 crc kubenswrapper[5008]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 18:02:48 crc kubenswrapper[5008]: Mar 18 18:02:48 crc kubenswrapper[5008]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:38.19471519 +0000 UTC m=+14.714188279,LastTimestamp:2026-03-18 18:02:38.19471519 +0000 UTC m=+14.714188279,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 18:02:48 crc kubenswrapper[5008]: > Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.727260 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e0181538db51d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:38.194758941 +0000 UTC m=+14.714232020,LastTimestamp:2026-03-18 18:02:38.194758941 +0000 UTC m=+14.714232020,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.730529 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e0181538d0a36\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 18:02:48 crc kubenswrapper[5008]: &Event{ObjectMeta:{kube-apiserver-crc.189e0181538d0a36 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 18:02:48 crc kubenswrapper[5008]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 18:02:48 crc kubenswrapper[5008]: Mar 18 18:02:48 crc kubenswrapper[5008]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:38.19471519 +0000 UTC 
m=+14.714188279,LastTimestamp:2026-03-18 18:02:38.208152799 +0000 UTC m=+14.727625898,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 18:02:48 crc kubenswrapper[5008]: > Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.734295 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e0181538db51d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e0181538db51d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:38.194758941 +0000 UTC m=+14.714232020,LastTimestamp:2026-03-18 18:02:38.208222461 +0000 UTC m=+14.727695560,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.737877 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 18:02:48 crc kubenswrapper[5008]: &Event{ObjectMeta:{kube-apiserver-crc.189e01816d8d0711 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 18 18:02:48 crc kubenswrapper[5008]: body: [+]ping ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]log ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]etcd ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/generic-apiserver-start-informers ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/priority-and-fairness-filter ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/start-apiextensions-informers ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/start-apiextensions-controllers ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/crd-informer-synced ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/start-system-namespaces-controller ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 18 18:02:48 crc kubenswrapper[5008]: 
[+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 18 18:02:48 crc kubenswrapper[5008]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 18 18:02:48 crc kubenswrapper[5008]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/bootstrap-controller ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/start-kube-aggregator-informers ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/apiservice-registration-controller ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/apiservice-discovery-controller ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]autoregister-completion ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/apiservice-openapi-controller ok Mar 18 18:02:48 crc kubenswrapper[5008]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 18 18:02:48 crc kubenswrapper[5008]: livez check failed Mar 18 18:02:48 crc kubenswrapper[5008]: Mar 18 18:02:48 crc kubenswrapper[5008]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:38.630922001 +0000 UTC m=+15.150395120,LastTimestamp:2026-03-18 18:02:38.630922001 +0000 UTC m=+15.150395120,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 18:02:48 crc kubenswrapper[5008]: > Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.742326 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e01816d8ecee6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:38.631038694 +0000 UTC m=+15.150511813,LastTimestamp:2026-03-18 18:02:38.631038694 +0000 UTC m=+15.150511813,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.746299 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e017ebefe0b50\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e017ebefe0b50 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:27.112381264 +0000 UTC m=+3.631854353,LastTimestamp:2026-03-18 18:02:39.317370277 +0000 UTC m=+15.836843356,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.751437 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 18:02:48 crc kubenswrapper[5008]: &Event{ObjectMeta:{kube-controller-manager-crc.189e0182bd3aded0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 18:02:48 crc kubenswrapper[5008]: body: Mar 18 18:02:48 crc kubenswrapper[5008]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:44.26268232 +0000 UTC m=+20.782155439,LastTimestamp:2026-03-18 18:02:44.26268232 +0000 UTC 
m=+20.782155439,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 18:02:48 crc kubenswrapper[5008]: > Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.754508 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e0182bd3c42a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:44.262773412 +0000 UTC m=+20.782246521,LastTimestamp:2026-03-18 18:02:44.262773412 +0000 UTC m=+20.782246521,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:48 crc kubenswrapper[5008]: W0318 18:02:48.970245 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 18 18:02:48 crc kubenswrapper[5008]: E0318 18:02:48.970327 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at 
the cluster scope" logger="UnhandledError" Mar 18 18:02:49 crc kubenswrapper[5008]: I0318 18:02:49.125256 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:50 crc kubenswrapper[5008]: I0318 18:02:50.128668 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:50 crc kubenswrapper[5008]: W0318 18:02:50.193253 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 18 18:02:50 crc kubenswrapper[5008]: E0318 18:02:50.193337 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 18:02:51 crc kubenswrapper[5008]: I0318 18:02:51.123838 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:51 crc kubenswrapper[5008]: I0318 18:02:51.585974 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:51 crc kubenswrapper[5008]: I0318 18:02:51.588134 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:51 crc kubenswrapper[5008]: I0318 18:02:51.588214 5008 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:51 crc kubenswrapper[5008]: I0318 18:02:51.588283 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:51 crc kubenswrapper[5008]: I0318 18:02:51.588335 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:02:51 crc kubenswrapper[5008]: E0318 18:02:51.594747 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 18:02:51 crc kubenswrapper[5008]: E0318 18:02:51.596197 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 18:02:51 crc kubenswrapper[5008]: W0318 18:02:51.731898 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:51 crc kubenswrapper[5008]: E0318 18:02:51.732396 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 18:02:51 crc kubenswrapper[5008]: W0318 18:02:51.974151 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group 
"node.k8s.io" at the cluster scope Mar 18 18:02:51 crc kubenswrapper[5008]: E0318 18:02:51.974240 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 18:02:52 crc kubenswrapper[5008]: I0318 18:02:52.124445 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:53 crc kubenswrapper[5008]: I0318 18:02:53.125393 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:54 crc kubenswrapper[5008]: I0318 18:02:54.124545 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:54 crc kubenswrapper[5008]: I0318 18:02:54.262012 5008 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 18:02:54 crc kubenswrapper[5008]: I0318 18:02:54.262121 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:02:54 crc kubenswrapper[5008]: I0318 18:02:54.262187 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:02:54 crc kubenswrapper[5008]: I0318 18:02:54.262363 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:54 crc kubenswrapper[5008]: I0318 18:02:54.263917 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:54 crc kubenswrapper[5008]: I0318 18:02:54.263955 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:54 crc kubenswrapper[5008]: I0318 18:02:54.263966 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:54 crc kubenswrapper[5008]: I0318 18:02:54.264505 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f5972f5ca38303ef2ebf0480fb68cbe693f99f58909bd703a7e9b35d6b6b4d8b"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 18:02:54 crc kubenswrapper[5008]: I0318 18:02:54.264719 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://f5972f5ca38303ef2ebf0480fb68cbe693f99f58909bd703a7e9b35d6b6b4d8b" gracePeriod=30 Mar 18 18:02:54 crc kubenswrapper[5008]: E0318 18:02:54.267589 5008 event.go:359] 
"Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e0180692398f3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 18:02:54 crc kubenswrapper[5008]: &Event{ObjectMeta:{kube-controller-manager-crc.189e0180692398f3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 18:02:54 crc kubenswrapper[5008]: body: Mar 18 18:02:54 crc kubenswrapper[5008]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:34.261936371 +0000 UTC m=+10.781409490,LastTimestamp:2026-03-18 18:02:54.2620887 +0000 UTC m=+30.781561789,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 18:02:54 crc kubenswrapper[5008]: > Mar 18 18:02:54 crc kubenswrapper[5008]: E0318 18:02:54.275412 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e0180692516e0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e0180692516e0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:34.262034144 +0000 UTC m=+10.781507263,LastTimestamp:2026-03-18 18:02:54.262148422 +0000 UTC m=+30.781621511,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:54 crc kubenswrapper[5008]: E0318 18:02:54.281898 5008 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e018511658a48 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:54.264699464 +0000 UTC m=+30.784172553,LastTimestamp:2026-03-18 18:02:54.264699464 +0000 UTC m=+30.784172553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:54 crc kubenswrapper[5008]: E0318 18:02:54.291323 5008 eviction_manager.go:285] "Eviction manager: failed to get summary 
stats" err="failed to get node info: node \"crc\" not found" Mar 18 18:02:54 crc kubenswrapper[5008]: E0318 18:02:54.394400 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e017e5de4479c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e5de4479c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.483302812 +0000 UTC m=+2.002775901,LastTimestamp:2026-03-18 18:02:54.387781965 +0000 UTC m=+30.907255054,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:54 crc kubenswrapper[5008]: E0318 18:02:54.608716 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e017e7306bfe7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e7306bfe7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.837883367 +0000 UTC m=+2.357356486,LastTimestamp:2026-03-18 18:02:54.602447408 +0000 UTC m=+31.121920487,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:54 crc kubenswrapper[5008]: E0318 18:02:54.625382 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e017e73ae9a15\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e017e73ae9a15 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:25.848883733 +0000 UTC m=+2.368356852,LastTimestamp:2026-03-18 18:02:54.617464422 +0000 UTC m=+31.136937501,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:02:55 crc kubenswrapper[5008]: I0318 18:02:55.124986 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" 
cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:55 crc kubenswrapper[5008]: I0318 18:02:55.387358 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 18:02:55 crc kubenswrapper[5008]: I0318 18:02:55.387953 5008 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f5972f5ca38303ef2ebf0480fb68cbe693f99f58909bd703a7e9b35d6b6b4d8b" exitCode=255 Mar 18 18:02:55 crc kubenswrapper[5008]: I0318 18:02:55.388023 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f5972f5ca38303ef2ebf0480fb68cbe693f99f58909bd703a7e9b35d6b6b4d8b"} Mar 18 18:02:55 crc kubenswrapper[5008]: I0318 18:02:55.388087 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"af028acb3982bdc511e57661b10a59b3488f1c244edfb1e241d86fc56d05aa4c"} Mar 18 18:02:55 crc kubenswrapper[5008]: I0318 18:02:55.388264 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:55 crc kubenswrapper[5008]: I0318 18:02:55.389870 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:55 crc kubenswrapper[5008]: I0318 18:02:55.389961 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:55 crc kubenswrapper[5008]: I0318 18:02:55.389985 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:56 crc kubenswrapper[5008]: I0318 18:02:56.122933 
5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:57 crc kubenswrapper[5008]: I0318 18:02:57.123206 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:58 crc kubenswrapper[5008]: I0318 18:02:58.123432 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:58 crc kubenswrapper[5008]: I0318 18:02:58.595215 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:58 crc kubenswrapper[5008]: I0318 18:02:58.597172 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:58 crc kubenswrapper[5008]: I0318 18:02:58.597246 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:58 crc kubenswrapper[5008]: I0318 18:02:58.597270 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:58 crc kubenswrapper[5008]: I0318 18:02:58.597312 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:02:58 crc kubenswrapper[5008]: E0318 18:02:58.604617 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 
18 18:02:58 crc kubenswrapper[5008]: E0318 18:02:58.604717 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 18:02:59 crc kubenswrapper[5008]: I0318 18:02:59.125257 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:02:59 crc kubenswrapper[5008]: I0318 18:02:59.198093 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:02:59 crc kubenswrapper[5008]: I0318 18:02:59.199953 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:02:59 crc kubenswrapper[5008]: I0318 18:02:59.200052 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:02:59 crc kubenswrapper[5008]: I0318 18:02:59.200076 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:02:59 crc kubenswrapper[5008]: I0318 18:02:59.201126 5008 scope.go:117] "RemoveContainer" containerID="c64a3dfda1f490a1cf022dd55f4f56b6bfc5de254a363bbd6238cf97679fc82e" Mar 18 18:03:00 crc kubenswrapper[5008]: I0318 18:03:00.124190 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:00 crc kubenswrapper[5008]: I0318 18:03:00.405425 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 18:03:00 crc 
kubenswrapper[5008]: I0318 18:03:00.406599 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 18:03:00 crc kubenswrapper[5008]: I0318 18:03:00.408957 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ebe9de58f480c74704a28247ecb9619ca763b5b2a8ca6dabea66d7c5da3ef3b7" exitCode=255 Mar 18 18:03:00 crc kubenswrapper[5008]: I0318 18:03:00.409017 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ebe9de58f480c74704a28247ecb9619ca763b5b2a8ca6dabea66d7c5da3ef3b7"} Mar 18 18:03:00 crc kubenswrapper[5008]: I0318 18:03:00.409105 5008 scope.go:117] "RemoveContainer" containerID="c64a3dfda1f490a1cf022dd55f4f56b6bfc5de254a363bbd6238cf97679fc82e" Mar 18 18:03:00 crc kubenswrapper[5008]: I0318 18:03:00.409286 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:00 crc kubenswrapper[5008]: I0318 18:03:00.410737 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:00 crc kubenswrapper[5008]: I0318 18:03:00.410784 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:00 crc kubenswrapper[5008]: I0318 18:03:00.410799 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:00 crc kubenswrapper[5008]: I0318 18:03:00.411719 5008 scope.go:117] "RemoveContainer" containerID="ebe9de58f480c74704a28247ecb9619ca763b5b2a8ca6dabea66d7c5da3ef3b7" Mar 18 18:03:00 crc kubenswrapper[5008]: E0318 18:03:00.411978 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:03:01 crc kubenswrapper[5008]: I0318 18:03:01.118714 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:01 crc kubenswrapper[5008]: I0318 18:03:01.262140 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:03:01 crc kubenswrapper[5008]: I0318 18:03:01.262875 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:01 crc kubenswrapper[5008]: I0318 18:03:01.264870 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:01 crc kubenswrapper[5008]: I0318 18:03:01.264943 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:01 crc kubenswrapper[5008]: I0318 18:03:01.264963 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:01 crc kubenswrapper[5008]: I0318 18:03:01.415498 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 18:03:02 crc kubenswrapper[5008]: I0318 18:03:02.125925 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in 
API group "storage.k8s.io" at the cluster scope Mar 18 18:03:02 crc kubenswrapper[5008]: I0318 18:03:02.464883 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:03:02 crc kubenswrapper[5008]: I0318 18:03:02.465195 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:02 crc kubenswrapper[5008]: I0318 18:03:02.467351 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:02 crc kubenswrapper[5008]: I0318 18:03:02.467410 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:02 crc kubenswrapper[5008]: I0318 18:03:02.467422 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:02 crc kubenswrapper[5008]: I0318 18:03:02.468165 5008 scope.go:117] "RemoveContainer" containerID="ebe9de58f480c74704a28247ecb9619ca763b5b2a8ca6dabea66d7c5da3ef3b7" Mar 18 18:03:02 crc kubenswrapper[5008]: E0318 18:03:02.468405 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:03:03 crc kubenswrapper[5008]: I0318 18:03:03.123713 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:04 crc kubenswrapper[5008]: I0318 18:03:04.074512 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:03:04 crc kubenswrapper[5008]: I0318 18:03:04.074795 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:04 crc kubenswrapper[5008]: I0318 18:03:04.076671 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:04 crc kubenswrapper[5008]: I0318 18:03:04.076727 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:04 crc kubenswrapper[5008]: I0318 18:03:04.076745 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:04 crc kubenswrapper[5008]: I0318 18:03:04.126033 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:04 crc kubenswrapper[5008]: I0318 18:03:04.262288 5008 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 18:03:04 crc kubenswrapper[5008]: I0318 18:03:04.262376 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 18:03:04 crc kubenswrapper[5008]: E0318 18:03:04.269315 5008 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e0182bd3aded0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 18:03:04 crc kubenswrapper[5008]: &Event{ObjectMeta:{kube-controller-manager-crc.189e0182bd3aded0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 18:03:04 crc kubenswrapper[5008]: body: Mar 18 18:03:04 crc kubenswrapper[5008]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:44.26268232 +0000 UTC m=+20.782155439,LastTimestamp:2026-03-18 18:03:04.262352223 +0000 UTC m=+40.781825302,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 18:03:04 crc kubenswrapper[5008]: > Mar 18 18:03:04 crc kubenswrapper[5008]: E0318 18:03:04.275864 5008 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e0182bd3c42a4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e0182bd3c42a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:02:44.262773412 +0000 UTC m=+20.782246521,LastTimestamp:2026-03-18 18:03:04.262407564 +0000 UTC m=+40.781880643,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:03:04 crc kubenswrapper[5008]: E0318 18:03:04.626667 5008 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 18:03:05 crc kubenswrapper[5008]: I0318 18:03:05.098305 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:03:05 crc kubenswrapper[5008]: I0318 18:03:05.098642 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:05 crc kubenswrapper[5008]: I0318 18:03:05.100671 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:05 crc kubenswrapper[5008]: I0318 18:03:05.100733 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:05 crc kubenswrapper[5008]: I0318 18:03:05.100746 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:05 crc kubenswrapper[5008]: I0318 18:03:05.101624 5008 scope.go:117] "RemoveContainer" 
containerID="ebe9de58f480c74704a28247ecb9619ca763b5b2a8ca6dabea66d7c5da3ef3b7" Mar 18 18:03:05 crc kubenswrapper[5008]: E0318 18:03:05.101834 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:03:05 crc kubenswrapper[5008]: I0318 18:03:05.125503 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:05 crc kubenswrapper[5008]: I0318 18:03:05.605002 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:05 crc kubenswrapper[5008]: I0318 18:03:05.606986 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:05 crc kubenswrapper[5008]: I0318 18:03:05.607100 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:05 crc kubenswrapper[5008]: I0318 18:03:05.607141 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:05 crc kubenswrapper[5008]: I0318 18:03:05.607199 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:03:05 crc kubenswrapper[5008]: E0318 18:03:05.612783 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 18:03:05 crc kubenswrapper[5008]: E0318 18:03:05.613081 
5008 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 18:03:06 crc kubenswrapper[5008]: I0318 18:03:06.123594 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:07 crc kubenswrapper[5008]: W0318 18:03:07.112773 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:07 crc kubenswrapper[5008]: E0318 18:03:07.112888 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 18:03:07 crc kubenswrapper[5008]: I0318 18:03:07.123970 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:07 crc kubenswrapper[5008]: W0318 18:03:07.372249 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 18 18:03:07 crc kubenswrapper[5008]: E0318 18:03:07.372393 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 18:03:07 crc kubenswrapper[5008]: W0318 18:03:07.910105 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 18 18:03:07 crc kubenswrapper[5008]: E0318 18:03:07.910176 5008 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 18:03:08 crc kubenswrapper[5008]: I0318 18:03:08.120891 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:09 crc kubenswrapper[5008]: I0318 18:03:09.122798 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:10 crc kubenswrapper[5008]: I0318 18:03:10.123308 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:11 crc kubenswrapper[5008]: I0318 18:03:11.128072 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 18 18:03:11 crc kubenswrapper[5008]: I0318 18:03:11.266954 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:03:11 crc kubenswrapper[5008]: I0318 18:03:11.267125 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:11 crc kubenswrapper[5008]: I0318 18:03:11.270484 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:11 crc kubenswrapper[5008]: I0318 18:03:11.270517 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:11 crc kubenswrapper[5008]: I0318 18:03:11.270605 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:11 crc kubenswrapper[5008]: I0318 18:03:11.271936 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:03:11 crc kubenswrapper[5008]: I0318 18:03:11.653226 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:11 crc kubenswrapper[5008]: I0318 18:03:11.654850 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:11 crc kubenswrapper[5008]: I0318 18:03:11.654932 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:11 crc kubenswrapper[5008]: I0318 18:03:11.654956 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:12 crc kubenswrapper[5008]: I0318 18:03:12.123649 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:12 crc kubenswrapper[5008]: I0318 18:03:12.613925 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:12 crc kubenswrapper[5008]: I0318 18:03:12.615973 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:12 crc kubenswrapper[5008]: I0318 18:03:12.616039 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:12 crc kubenswrapper[5008]: I0318 18:03:12.616063 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:12 crc kubenswrapper[5008]: I0318 18:03:12.616112 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:03:12 crc kubenswrapper[5008]: E0318 18:03:12.623886 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 18:03:12 crc kubenswrapper[5008]: E0318 18:03:12.624231 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 18:03:12 crc kubenswrapper[5008]: W0318 18:03:12.875986 5008 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 18 18:03:12 crc kubenswrapper[5008]: E0318 18:03:12.876088 5008 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 18:03:13 crc kubenswrapper[5008]: I0318 18:03:13.123210 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:14 crc kubenswrapper[5008]: I0318 18:03:14.125026 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:14 crc kubenswrapper[5008]: E0318 18:03:14.627386 5008 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 18:03:15 crc kubenswrapper[5008]: I0318 18:03:15.122364 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:16 crc kubenswrapper[5008]: I0318 18:03:16.121278 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:17 crc kubenswrapper[5008]: I0318 18:03:17.122965 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the 
cluster scope Mar 18 18:03:18 crc kubenswrapper[5008]: I0318 18:03:18.120721 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:18 crc kubenswrapper[5008]: I0318 18:03:18.197483 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:18 crc kubenswrapper[5008]: I0318 18:03:18.199271 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:18 crc kubenswrapper[5008]: I0318 18:03:18.199387 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:18 crc kubenswrapper[5008]: I0318 18:03:18.199452 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:18 crc kubenswrapper[5008]: I0318 18:03:18.200120 5008 scope.go:117] "RemoveContainer" containerID="ebe9de58f480c74704a28247ecb9619ca763b5b2a8ca6dabea66d7c5da3ef3b7" Mar 18 18:03:18 crc kubenswrapper[5008]: E0318 18:03:18.200495 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:03:18 crc kubenswrapper[5008]: I0318 18:03:18.797097 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 18:03:18 crc kubenswrapper[5008]: I0318 18:03:18.797301 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 18 18:03:18 crc kubenswrapper[5008]: I0318 18:03:18.798771 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:18 crc kubenswrapper[5008]: I0318 18:03:18.798849 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:18 crc kubenswrapper[5008]: I0318 18:03:18.798866 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:19 crc kubenswrapper[5008]: I0318 18:03:19.123502 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:19 crc kubenswrapper[5008]: I0318 18:03:19.624453 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:19 crc kubenswrapper[5008]: I0318 18:03:19.626185 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:19 crc kubenswrapper[5008]: I0318 18:03:19.626301 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:19 crc kubenswrapper[5008]: I0318 18:03:19.626381 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:19 crc kubenswrapper[5008]: I0318 18:03:19.626477 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:03:19 crc kubenswrapper[5008]: E0318 18:03:19.629786 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 18:03:19 crc kubenswrapper[5008]: E0318 18:03:19.630085 5008 
controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 18:03:20 crc kubenswrapper[5008]: I0318 18:03:20.121401 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:21 crc kubenswrapper[5008]: I0318 18:03:21.123450 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:22 crc kubenswrapper[5008]: I0318 18:03:22.121499 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:23 crc kubenswrapper[5008]: I0318 18:03:23.124442 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:24 crc kubenswrapper[5008]: I0318 18:03:24.124758 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:24 crc kubenswrapper[5008]: E0318 18:03:24.628328 5008 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 18:03:25 crc 
kubenswrapper[5008]: I0318 18:03:25.123976 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:26 crc kubenswrapper[5008]: I0318 18:03:26.122114 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:26 crc kubenswrapper[5008]: I0318 18:03:26.629892 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:26 crc kubenswrapper[5008]: I0318 18:03:26.631957 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:26 crc kubenswrapper[5008]: I0318 18:03:26.632019 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:26 crc kubenswrapper[5008]: I0318 18:03:26.632038 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:26 crc kubenswrapper[5008]: I0318 18:03:26.632073 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:03:26 crc kubenswrapper[5008]: E0318 18:03:26.636408 5008 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 18:03:26 crc kubenswrapper[5008]: E0318 18:03:26.636827 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" 
interval="7s" Mar 18 18:03:27 crc kubenswrapper[5008]: I0318 18:03:27.124727 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:28 crc kubenswrapper[5008]: I0318 18:03:28.123392 5008 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 18:03:28 crc kubenswrapper[5008]: I0318 18:03:28.811538 5008 csr.go:261] certificate signing request csr-jx6s4 is approved, waiting to be issued Mar 18 18:03:28 crc kubenswrapper[5008]: I0318 18:03:28.818976 5008 csr.go:257] certificate signing request csr-jx6s4 is issued Mar 18 18:03:28 crc kubenswrapper[5008]: I0318 18:03:28.873994 5008 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 18 18:03:28 crc kubenswrapper[5008]: I0318 18:03:28.971495 5008 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 18:03:29 crc kubenswrapper[5008]: I0318 18:03:29.820918 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-19 14:20:15.653270986 +0000 UTC Mar 18 18:03:29 crc kubenswrapper[5008]: I0318 18:03:29.822139 5008 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7364h16m45.831144684s for next certificate rotation Mar 18 18:03:32 crc kubenswrapper[5008]: I0318 18:03:32.198184 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:32 crc kubenswrapper[5008]: I0318 18:03:32.199883 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 18:03:32 crc kubenswrapper[5008]: I0318 18:03:32.199911 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:32 crc kubenswrapper[5008]: I0318 18:03:32.199919 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:32 crc kubenswrapper[5008]: I0318 18:03:32.200408 5008 scope.go:117] "RemoveContainer" containerID="ebe9de58f480c74704a28247ecb9619ca763b5b2a8ca6dabea66d7c5da3ef3b7" Mar 18 18:03:32 crc kubenswrapper[5008]: I0318 18:03:32.712498 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 18:03:32 crc kubenswrapper[5008]: I0318 18:03:32.714519 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960"} Mar 18 18:03:32 crc kubenswrapper[5008]: I0318 18:03:32.714800 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:32 crc kubenswrapper[5008]: I0318 18:03:32.716150 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:32 crc kubenswrapper[5008]: I0318 18:03:32.716183 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:32 crc kubenswrapper[5008]: I0318 18:03:32.716195 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.637615 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:33 crc kubenswrapper[5008]: 
I0318 18:03:33.639541 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.639608 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.639622 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.639753 5008 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.649644 5008 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.650239 5008 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 18 18:03:33 crc kubenswrapper[5008]: E0318 18:03:33.650288 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.654373 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.654417 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.654426 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.654443 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.654457 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:33Z","lastTransitionTime":"2026-03-18T18:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:33 crc kubenswrapper[5008]: E0318 18:03:33.666670 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.672906 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.672935 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.672943 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.672956 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.672968 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:33Z","lastTransitionTime":"2026-03-18T18:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:33 crc kubenswrapper[5008]: E0318 18:03:33.681284 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.687384 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.687408 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.687416 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.687428 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.687437 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:33Z","lastTransitionTime":"2026-03-18T18:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:33 crc kubenswrapper[5008]: E0318 18:03:33.695928 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.701927 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.701990 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.702000 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.702016 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.702026 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:33Z","lastTransitionTime":"2026-03-18T18:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:33 crc kubenswrapper[5008]: E0318 18:03:33.710731 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:33 crc kubenswrapper[5008]: E0318 18:03:33.710877 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 18:03:33 crc kubenswrapper[5008]: E0318 18:03:33.710897 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.718567 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.719022 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.720776 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960" exitCode=255 Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.720807 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960"} Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.720842 5008 scope.go:117] "RemoveContainer" containerID="ebe9de58f480c74704a28247ecb9619ca763b5b2a8ca6dabea66d7c5da3ef3b7" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.720996 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:33 crc 
kubenswrapper[5008]: I0318 18:03:33.721743 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.721766 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.721775 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:33 crc kubenswrapper[5008]: I0318 18:03:33.722237 5008 scope.go:117] "RemoveContainer" containerID="1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960" Mar 18 18:03:33 crc kubenswrapper[5008]: E0318 18:03:33.722381 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:03:33 crc kubenswrapper[5008]: E0318 18:03:33.811253 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:33 crc kubenswrapper[5008]: E0318 18:03:33.912292 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:34 crc kubenswrapper[5008]: E0318 18:03:34.012904 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:34 crc kubenswrapper[5008]: E0318 18:03:34.113026 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:34 crc kubenswrapper[5008]: E0318 18:03:34.213446 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 
18:03:34 crc kubenswrapper[5008]: E0318 18:03:34.314369 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:34 crc kubenswrapper[5008]: E0318 18:03:34.415221 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:34 crc kubenswrapper[5008]: E0318 18:03:34.515512 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:34 crc kubenswrapper[5008]: E0318 18:03:34.616250 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:34 crc kubenswrapper[5008]: E0318 18:03:34.629179 5008 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 18:03:34 crc kubenswrapper[5008]: E0318 18:03:34.716624 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:34 crc kubenswrapper[5008]: I0318 18:03:34.725794 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 18:03:34 crc kubenswrapper[5008]: E0318 18:03:34.817085 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:34 crc kubenswrapper[5008]: E0318 18:03:34.917919 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:35 crc kubenswrapper[5008]: E0318 18:03:35.018742 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.099190 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:03:35 crc 
kubenswrapper[5008]: I0318 18:03:35.099521 5008 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.101280 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.101352 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.101377 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.102309 5008 scope.go:117] "RemoveContainer" containerID="1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960" Mar 18 18:03:35 crc kubenswrapper[5008]: E0318 18:03:35.102611 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:03:35 crc kubenswrapper[5008]: E0318 18:03:35.119802 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:35 crc kubenswrapper[5008]: E0318 18:03:35.219995 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:35 crc kubenswrapper[5008]: E0318 18:03:35.320286 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:35 crc kubenswrapper[5008]: E0318 18:03:35.420701 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 
18:03:35 crc kubenswrapper[5008]: E0318 18:03:35.521727 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:35 crc kubenswrapper[5008]: E0318 18:03:35.622221 5008 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.648537 5008 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.724512 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.724592 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.724609 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.724632 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.724651 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:35Z","lastTransitionTime":"2026-03-18T18:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.826960 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.827010 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.827022 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.827039 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.827053 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:35Z","lastTransitionTime":"2026-03-18T18:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.929643 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.929939 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.930134 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.930346 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:35 crc kubenswrapper[5008]: I0318 18:03:35.930626 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:35Z","lastTransitionTime":"2026-03-18T18:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.033651 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.034082 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.034334 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.034492 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.034684 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:36Z","lastTransitionTime":"2026-03-18T18:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.138071 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.138485 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.138728 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.138933 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.139110 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:36Z","lastTransitionTime":"2026-03-18T18:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.142253 5008 apiserver.go:52] "Watching apiserver" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.147614 5008 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.147852 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.148303 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.148358 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.148519 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.148857 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.148886 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.148944 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.148970 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.149157 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.151518 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.156864 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.156907 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.156917 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.157030 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.157752 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.158906 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.159147 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.159961 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.160378 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.175828 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.189803 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.208221 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.220784 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.222931 5008 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.225206 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.225471 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.225541 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.225592 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.225615 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.225673 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.225749 5008 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:03:36.725706109 +0000 UTC m=+73.245179248 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.225762 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.226013 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.226452 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.226509 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.226832 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.226881 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.226928 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.226954 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.227920 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.228402 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.228471 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.228490 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.228909 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.228508 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229066 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229087 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229128 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229148 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229165 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229205 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229225 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229246 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229286 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229306 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229323 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229339 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229380 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229401 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229419 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 
18:03:36.229513 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229548 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229636 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229881 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.229909 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.230075 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.230189 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.230296 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.230338 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.230357 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.230439 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.230634 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.230643 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.230669 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.230684 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.230837 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.230962 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231006 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231052 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231084 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231117 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231147 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231178 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231199 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231210 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231244 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231276 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod 
\"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231287 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231309 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231347 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231380 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231412 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231444 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231476 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231507 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231540 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231604 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231642 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231678 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231709 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231740 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231769 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231803 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231839 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231870 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231902 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231934 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231967 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232001 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232037 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232073 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232105 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232140 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231307 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" 
(OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231401 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231455 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231621 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231897 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231937 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.231976 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232539 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232066 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232278 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232348 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232181 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232727 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232762 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" 
(UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232796 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232826 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232862 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232878 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232893 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232932 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232962 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.232996 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233025 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233058 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" 
(UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233057 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233088 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233110 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233116 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233124 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233148 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233157 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233171 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233189 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233220 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233252 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233284 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233316 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233346 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233381 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233412 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233443 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233477 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233511 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 
18:03:36.234301 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.234388 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.234422 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.234459 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.234491 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.234523 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235007 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235045 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235077 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235111 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235143 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235183 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235216 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235249 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235288 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235322 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233369 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod 
"31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233410 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235357 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235396 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235428 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235460 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" 
(UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235492 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235527 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235568 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235660 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235697 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235729 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235760 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235795 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235828 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235862 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235896 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 
18:03:36.235928 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235959 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235991 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236021 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236060 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236092 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236126 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236159 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236191 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236228 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236303 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236337 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236408 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236551 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236645 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236699 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236747 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236804 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236860 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236894 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236929 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236963 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 
18:03:36.236998 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237033 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237085 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237149 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237200 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237249 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237287 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237320 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237353 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237389 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237448 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 
18:03:36.237503 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237540 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237610 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237647 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237684 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237717 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" 
(UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237750 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237784 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237818 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237852 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237890 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237929 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238074 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238115 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238148 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238183 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238216 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 
18:03:36.238249 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238283 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238317 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238418 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238465 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238499 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238534 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238617 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238776 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238818 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238856 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238891 5008 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.239280 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.239373 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.239420 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.239464 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.239595 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.239616 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.239692 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.239777 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.239853 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.239936 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.240057 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.240152 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.240236 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233530 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233656 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233826 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233908 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.233932 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.234144 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.234654 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.234709 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.234994 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235056 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235064 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235067 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235105 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235330 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235334 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235590 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235597 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235615 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235424 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235671 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.235828 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236084 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236050 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236170 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236213 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236773 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236891 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236929 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236984 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237360 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237816 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237829 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.240773 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237849 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.237940 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238339 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238442 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238484 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238492 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238639 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238547 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238734 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238756 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238691 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.238640 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.239609 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.240773 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.241379 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.241873 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.242050 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.242177 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.242178 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.242257 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.242474 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.242988 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.243066 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.243257 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.243514 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.243543 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.243545 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.243557 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.244035 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.244187 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.246149 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.246093 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.246238 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.246249 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.246511 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.246661 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.246843 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.246868 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.246955 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.246953 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.247246 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.247276 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.236779 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.248399 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.251004 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.251696 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.251755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.251769 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.251785 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.251796 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:36Z","lastTransitionTime":"2026-03-18T18:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.252056 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:36.751909809 +0000 UTC m=+73.271382908 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.254038 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.254474 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.254617 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.240324 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.254722 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.254773 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255023 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255035 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255095 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255139 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255174 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255207 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" 
(UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255205 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255233 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255260 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255286 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255312 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255388 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255589 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255952 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.255972 5008 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256002 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256016 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256029 5008 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256042 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256054 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256067 5008 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256079 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256094 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256109 5008 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256127 5008 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256144 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256160 5008 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256176 5008 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256172 5008 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256192 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256210 5008 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256211 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.256280 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.256342 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:36.756325956 +0000 UTC m=+73.275799055 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256229 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256372 5008 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256388 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256403 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256418 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256433 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc 
kubenswrapper[5008]: I0318 18:03:36.256447 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256460 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256473 5008 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256486 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256499 5008 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256511 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256524 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256537 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256557 5008 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256591 5008 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256604 5008 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256667 5008 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256687 5008 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256761 5008 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256774 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc 
kubenswrapper[5008]: I0318 18:03:36.256788 5008 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256801 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256842 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256859 5008 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256872 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256886 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256920 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.256927 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.257095 5008 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.257841 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.258504 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.258567 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.258584 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.258992 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.259279 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.259463 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.259767 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.260048 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.260145 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.260375 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.260419 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.260451 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.260618 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:36.760465964 +0000 UTC m=+73.279939154 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.260653 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.262449 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.262516 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.262657 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.262749 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.266988 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.267029 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.267044 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.267107 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:36.767085439 +0000 UTC m=+73.286558568 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.267710 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.267738 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.269095 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.269268 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.269342 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.272520 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.272639 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.272707 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.272844 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.273167 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.273179 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.274013 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.274056 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.274174 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.274538 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.276932 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.277092 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.277180 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.277381 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.277383 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.277411 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.277955 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.278141 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.278303 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.278718 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.278808 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.279569 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.279591 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.279607 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.279623 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.279636 5008 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.279649 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.278813 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.278877 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.281741 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.281798 5008 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.281842 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.281865 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.281887 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.281913 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.281952 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.281973 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.281992 5008 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282018 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282036 5008 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" 
DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282056 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282077 5008 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282104 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282124 5008 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282142 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282161 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282186 5008 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282204 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282225 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282244 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282272 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282290 5008 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282309 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282341 5008 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282360 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282379 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282399 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282424 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282457 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282476 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282496 5008 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282521 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282544 5008 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282570 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282619 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282638 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282656 5008 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282674 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282699 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282718 5008 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282737 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282758 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282783 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282803 5008 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282822 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282841 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282867 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282887 5008 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282907 5008 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282932 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282938 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282952 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282970 5008 reconciler_common.go:293] 
"Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.282988 5008 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.283014 5008 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.283036 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.283056 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.283075 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.283101 5008 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.283119 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.283138 5008 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.283157 5008 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.283182 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.283201 5008 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.283220 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.283245 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.283264 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 
18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.284894 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.285375 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.285955 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.286545 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.286565 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.287123 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.287191 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.287402 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.287545 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.288055 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.288453 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.288461 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.288728 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.288868 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.289190 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.289202 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.289672 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.289831 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.291189 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.291765 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.294349 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.294977 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.290803 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.295131 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.295502 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.295444 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.297010 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.301925 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.302075 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.302074 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.302280 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.302523 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.302623 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.304331 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.309741 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.313409 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.324316 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.326783 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.332704 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.344030 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.355024 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.355077 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.355095 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.355119 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.355138 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:36Z","lastTransitionTime":"2026-03-18T18:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385179 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385261 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385358 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385377 5008 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385391 5008 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385402 5008 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385416 5008 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385430 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385443 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385456 5008 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385469 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385481 5008 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385493 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385504 5008 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385517 5008 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385529 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385543 5008 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385559 5008 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385609 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385623 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385634 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on 
node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385646 5008 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385657 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385667 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385679 5008 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385689 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385699 5008 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385709 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385720 5008 reconciler_common.go:293] "Volume detached for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385731 5008 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385742 5008 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385753 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385764 5008 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385776 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385787 5008 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385800 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385811 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385827 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385820 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385840 5008 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385933 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.385931 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 
18:03:36.385955 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386026 5008 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386049 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386070 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386100 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386128 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386151 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386172 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386190 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386213 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386240 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386262 5008 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386282 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386301 5008 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386320 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" 
DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386338 5008 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386359 5008 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386378 5008 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386396 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386413 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386431 5008 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386449 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386467 5008 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386488 5008 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386512 5008 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386537 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386646 5008 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386678 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386704 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386726 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386745 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386763 5008 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.386782 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.457938 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.458013 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.458036 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.458067 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.458090 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:36Z","lastTransitionTime":"2026-03-18T18:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.469657 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.475507 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.484740 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.494516 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:36 crc kubenswrapper[5008]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 18:03:36 crc kubenswrapper[5008]: if [[ -f "/env/_master" ]]; then Mar 18 18:03:36 crc kubenswrapper[5008]: set -o allexport Mar 18 18:03:36 crc kubenswrapper[5008]: source "/env/_master" Mar 18 18:03:36 crc kubenswrapper[5008]: set +o allexport Mar 18 18:03:36 crc kubenswrapper[5008]: fi Mar 18 18:03:36 crc kubenswrapper[5008]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 18:03:36 crc kubenswrapper[5008]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 18:03:36 crc kubenswrapper[5008]: ho_enable="--enable-hybrid-overlay" Mar 18 18:03:36 crc kubenswrapper[5008]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 18:03:36 crc kubenswrapper[5008]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 18:03:36 crc kubenswrapper[5008]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 18:03:36 crc kubenswrapper[5008]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 18:03:36 crc kubenswrapper[5008]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 18:03:36 crc kubenswrapper[5008]: --webhook-host=127.0.0.1 \ Mar 18 18:03:36 crc kubenswrapper[5008]: --webhook-port=9743 \ Mar 18 18:03:36 crc kubenswrapper[5008]: ${ho_enable} \ Mar 18 18:03:36 crc kubenswrapper[5008]: --enable-interconnect \ Mar 18 18:03:36 crc kubenswrapper[5008]: --disable-approver \ Mar 18 18:03:36 crc kubenswrapper[5008]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 18:03:36 crc kubenswrapper[5008]: --wait-for-kubernetes-api=200s \ Mar 18 18:03:36 crc kubenswrapper[5008]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 18:03:36 crc kubenswrapper[5008]: --loglevel="${LOGLEVEL}" Mar 18 18:03:36 crc kubenswrapper[5008]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:36 crc kubenswrapper[5008]: > logger="UnhandledError" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.495664 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:36 crc kubenswrapper[5008]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 
18:03:36 crc kubenswrapper[5008]: set -o allexport Mar 18 18:03:36 crc kubenswrapper[5008]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 18:03:36 crc kubenswrapper[5008]: source /etc/kubernetes/apiserver-url.env Mar 18 18:03:36 crc kubenswrapper[5008]: else Mar 18 18:03:36 crc kubenswrapper[5008]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 18:03:36 crc kubenswrapper[5008]: exit 1 Mar 18 18:03:36 crc kubenswrapper[5008]: fi Mar 18 18:03:36 crc kubenswrapper[5008]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 18:03:36 crc kubenswrapper[5008]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a247
3a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:36 crc kubenswrapper[5008]: > logger="UnhandledError" Mar 18 
18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.496842 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.496897 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:36 crc kubenswrapper[5008]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 18:03:36 crc kubenswrapper[5008]: if [[ -f "/env/_master" ]]; then Mar 18 18:03:36 crc kubenswrapper[5008]: set -o allexport Mar 18 18:03:36 crc kubenswrapper[5008]: source "/env/_master" Mar 18 18:03:36 crc kubenswrapper[5008]: set +o allexport Mar 18 18:03:36 crc kubenswrapper[5008]: fi Mar 18 18:03:36 crc kubenswrapper[5008]: Mar 18 18:03:36 crc kubenswrapper[5008]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 18:03:36 crc kubenswrapper[5008]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 18:03:36 crc kubenswrapper[5008]: --disable-webhook \ Mar 18 18:03:36 crc kubenswrapper[5008]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 18:03:36 crc kubenswrapper[5008]: --loglevel="${LOGLEVEL}" Mar 18 18:03:36 crc kubenswrapper[5008]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:36 crc kubenswrapper[5008]: > logger="UnhandledError" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.498250 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 18:03:36 crc kubenswrapper[5008]: W0318 18:03:36.509644 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-13491fb60e4fbfef8343753b3d9ae5baaa6c76d4db2c25fe37a72870bc0f171e WatchSource:0}: Error finding container 13491fb60e4fbfef8343753b3d9ae5baaa6c76d4db2c25fe37a72870bc0f171e: Status 404 returned error can't find the container with id 13491fb60e4fbfef8343753b3d9ae5baaa6c76d4db2c25fe37a72870bc0f171e Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.512458 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.513644 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.561195 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.561241 
5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.561253 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.561272 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.561288 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:36Z","lastTransitionTime":"2026-03-18T18:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.665490 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.665535 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.665545 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.665560 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.665586 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:36Z","lastTransitionTime":"2026-03-18T18:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.734528 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8e58a215104da56fef962aac2da0ff7ae6d0a88384cf56f33fac32a6cf659a4d"} Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.736062 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"13491fb60e4fbfef8343753b3d9ae5baaa6c76d4db2c25fe37a72870bc0f171e"} Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.736210 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:36 crc kubenswrapper[5008]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 18:03:36 crc kubenswrapper[5008]: set -o allexport Mar 18 18:03:36 crc kubenswrapper[5008]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 18:03:36 crc kubenswrapper[5008]: source /etc/kubernetes/apiserver-url.env Mar 18 18:03:36 crc kubenswrapper[5008]: else Mar 18 18:03:36 crc kubenswrapper[5008]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 18:03:36 crc kubenswrapper[5008]: exit 1 Mar 18 18:03:36 crc kubenswrapper[5008]: fi Mar 18 18:03:36 crc kubenswrapper[5008]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 18:03:36 crc kubenswrapper[5008]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:36 crc kubenswrapper[5008]: > logger="UnhandledError" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.737206 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.737269 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.738481 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.738611 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cd4a072bc342ef64da6526be5fbeff5f79d2433294248de5eeb06e1e7ede3b95"} Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.740089 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:36 crc kubenswrapper[5008]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 18:03:36 crc kubenswrapper[5008]: if [[ -f "/env/_master" ]]; then Mar 18 18:03:36 crc kubenswrapper[5008]: set -o allexport Mar 18 18:03:36 crc kubenswrapper[5008]: source "/env/_master" Mar 18 18:03:36 crc kubenswrapper[5008]: set +o allexport Mar 18 18:03:36 crc kubenswrapper[5008]: fi Mar 18 18:03:36 crc kubenswrapper[5008]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not 
enabled. Mar 18 18:03:36 crc kubenswrapper[5008]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 18:03:36 crc kubenswrapper[5008]: ho_enable="--enable-hybrid-overlay" Mar 18 18:03:36 crc kubenswrapper[5008]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 18:03:36 crc kubenswrapper[5008]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 18:03:36 crc kubenswrapper[5008]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 18:03:36 crc kubenswrapper[5008]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 18:03:36 crc kubenswrapper[5008]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 18:03:36 crc kubenswrapper[5008]: --webhook-host=127.0.0.1 \ Mar 18 18:03:36 crc kubenswrapper[5008]: --webhook-port=9743 \ Mar 18 18:03:36 crc kubenswrapper[5008]: ${ho_enable} \ Mar 18 18:03:36 crc kubenswrapper[5008]: --enable-interconnect \ Mar 18 18:03:36 crc kubenswrapper[5008]: --disable-approver \ Mar 18 18:03:36 crc kubenswrapper[5008]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 18:03:36 crc kubenswrapper[5008]: --wait-for-kubernetes-api=200s \ Mar 18 18:03:36 crc kubenswrapper[5008]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 18:03:36 crc kubenswrapper[5008]: --loglevel="${LOGLEVEL}" Mar 18 18:03:36 crc kubenswrapper[5008]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:36 crc kubenswrapper[5008]: > logger="UnhandledError" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.742068 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:03:36 crc kubenswrapper[5008]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe 
Mar 18 18:03:36 crc kubenswrapper[5008]: if [[ -f "/env/_master" ]]; then Mar 18 18:03:36 crc kubenswrapper[5008]: set -o allexport Mar 18 18:03:36 crc kubenswrapper[5008]: source "/env/_master" Mar 18 18:03:36 crc kubenswrapper[5008]: set +o allexport Mar 18 18:03:36 crc kubenswrapper[5008]: fi Mar 18 18:03:36 crc kubenswrapper[5008]: Mar 18 18:03:36 crc kubenswrapper[5008]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 18:03:36 crc kubenswrapper[5008]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 18:03:36 crc kubenswrapper[5008]: --disable-webhook \ Mar 18 18:03:36 crc kubenswrapper[5008]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 18:03:36 crc kubenswrapper[5008]: --loglevel="${LOGLEVEL}" Mar 18 18:03:36 crc kubenswrapper[5008]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 18:03:36 crc kubenswrapper[5008]: > logger="UnhandledError" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.743434 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.748337 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.762153 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.769295 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.769369 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.769387 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 
18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.769408 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.769455 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:36Z","lastTransitionTime":"2026-03-18T18:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.778371 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.787676 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.791414 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.791700 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:03:37.79160087 +0000 UTC m=+74.311073989 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.791955 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.792024 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.792055 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.792087 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.792158 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.792187 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:37.792170445 +0000 UTC m=+74.311643564 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.792204 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.792224 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:37.792203046 +0000 UTC m=+74.311676205 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.792226 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.792258 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.792274 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.792289 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.792302 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.792333 5008 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:37.792319649 +0000 UTC m=+74.311792738 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:36 crc kubenswrapper[5008]: E0318 18:03:36.792356 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:37.79234775 +0000 UTC m=+74.311820849 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.792095 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.797550 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.811806 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.827361 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.836193 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.845829 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.855326 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.865702 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.872370 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.872402 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.872412 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.872426 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.872436 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:36Z","lastTransitionTime":"2026-03-18T18:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.876426 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.974223 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.974276 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.974285 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.974307 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:36 crc kubenswrapper[5008]: I0318 18:03:36.974318 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:36Z","lastTransitionTime":"2026-03-18T18:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.077614 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.077673 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.077690 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.077711 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.077724 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:37Z","lastTransitionTime":"2026-03-18T18:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.180257 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.180296 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.180306 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.180320 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.180332 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:37Z","lastTransitionTime":"2026-03-18T18:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.282280 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.282469 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.282488 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.282501 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.282509 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:37Z","lastTransitionTime":"2026-03-18T18:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.385042 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.385111 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.385136 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.385208 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.385234 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:37Z","lastTransitionTime":"2026-03-18T18:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.489113 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.489201 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.489225 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.489260 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.489282 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:37Z","lastTransitionTime":"2026-03-18T18:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.591680 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.591742 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.591756 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.591778 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.591793 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:37Z","lastTransitionTime":"2026-03-18T18:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.694182 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.694214 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.694222 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.694237 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.694247 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:37Z","lastTransitionTime":"2026-03-18T18:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.800764 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.800857 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.800876 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.800895 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.800913 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:37Z","lastTransitionTime":"2026-03-18T18:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.802308 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.802389 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.802416 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:37 crc kubenswrapper[5008]: E0318 18:03:37.802445 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:03:39.802415037 +0000 UTC m=+76.321888136 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:03:37 crc kubenswrapper[5008]: E0318 18:03:37.802516 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:37 crc kubenswrapper[5008]: E0318 18:03:37.802581 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:39.802552201 +0000 UTC m=+76.322025280 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:37 crc kubenswrapper[5008]: E0318 18:03:37.802692 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:37 crc kubenswrapper[5008]: E0318 18:03:37.802720 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:37 crc kubenswrapper[5008]: E0318 18:03:37.802736 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:37 crc kubenswrapper[5008]: E0318 18:03:37.802781 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:39.802766957 +0000 UTC m=+76.322240046 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.802895 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.802925 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:37 crc kubenswrapper[5008]: E0318 18:03:37.802966 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:37 crc kubenswrapper[5008]: E0318 18:03:37.802993 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:39.802986452 +0000 UTC m=+76.322459531 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:37 crc kubenswrapper[5008]: E0318 18:03:37.803049 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:37 crc kubenswrapper[5008]: E0318 18:03:37.803060 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:37 crc kubenswrapper[5008]: E0318 18:03:37.803071 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:37 crc kubenswrapper[5008]: E0318 18:03:37.803092 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:39.803085775 +0000 UTC m=+76.322558854 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.904067 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.904111 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.904124 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.904143 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:37 crc kubenswrapper[5008]: I0318 18:03:37.904158 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:37Z","lastTransitionTime":"2026-03-18T18:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.006128 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.006161 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.006172 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.006188 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.006199 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:38Z","lastTransitionTime":"2026-03-18T18:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.108976 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.109041 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.109058 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.109083 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.109107 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:38Z","lastTransitionTime":"2026-03-18T18:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.197317 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.197407 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.197480 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:38 crc kubenswrapper[5008]: E0318 18:03:38.197648 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:03:38 crc kubenswrapper[5008]: E0318 18:03:38.197820 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:03:38 crc kubenswrapper[5008]: E0318 18:03:38.198194 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.205461 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.206185 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.208527 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.209488 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.211352 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.212181 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.212735 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.213173 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.213324 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.214110 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.214406 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.214644 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:38Z","lastTransitionTime":"2026-03-18T18:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.214939 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.216272 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.217847 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.218966 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.220800 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.221646 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.222656 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.224224 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.225160 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.226656 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.227462 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.228407 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.229937 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.230545 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.231811 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.232360 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" 
path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.233768 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.234345 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.235145 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.236517 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.237141 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.238488 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.239156 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.240167 5008 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.240338 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.242516 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.243700 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.244229 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.246336 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.247216 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.248361 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.249158 5008 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.250602 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.251169 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.252869 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.255385 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.256856 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.259020 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.260350 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.262090 5008 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.263093 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.264154 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.264738 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.265250 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.266255 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.266872 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.268340 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.316690 5008 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.316726 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.316735 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.316748 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.316757 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:38Z","lastTransitionTime":"2026-03-18T18:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.420273 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.420338 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.420362 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.420392 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.420415 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:38Z","lastTransitionTime":"2026-03-18T18:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.523204 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.523250 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.523261 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.523279 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.523292 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:38Z","lastTransitionTime":"2026-03-18T18:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.626761 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.626821 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.626843 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.626871 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.626892 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:38Z","lastTransitionTime":"2026-03-18T18:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.729376 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.729438 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.729465 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.729497 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.729522 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:38Z","lastTransitionTime":"2026-03-18T18:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.832630 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.832678 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.832688 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.832706 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.832717 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:38Z","lastTransitionTime":"2026-03-18T18:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.936029 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.936074 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.936086 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.936104 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:38 crc kubenswrapper[5008]: I0318 18:03:38.936118 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:38Z","lastTransitionTime":"2026-03-18T18:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.038696 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.038763 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.038787 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.038819 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.038841 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:39Z","lastTransitionTime":"2026-03-18T18:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.142406 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.142447 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.142458 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.142474 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.142486 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:39Z","lastTransitionTime":"2026-03-18T18:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.245262 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.245314 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.245327 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.245343 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.245353 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:39Z","lastTransitionTime":"2026-03-18T18:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.347776 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.347818 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.347829 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.347846 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.347859 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:39Z","lastTransitionTime":"2026-03-18T18:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.451841 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.451919 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.451943 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.451972 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.451992 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:39Z","lastTransitionTime":"2026-03-18T18:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.555161 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.555239 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.555263 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.555296 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.555317 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:39Z","lastTransitionTime":"2026-03-18T18:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.658781 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.658844 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.658861 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.658888 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.658907 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:39Z","lastTransitionTime":"2026-03-18T18:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.762115 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.762180 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.762201 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.762228 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.762249 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:39Z","lastTransitionTime":"2026-03-18T18:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.824954 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.825072 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:39 crc kubenswrapper[5008]: E0318 18:03:39.825152 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:03:43.825069277 +0000 UTC m=+80.344542386 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.825212 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.825260 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:39 crc kubenswrapper[5008]: E0318 18:03:39.825277 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.825300 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 
18:03:39 crc kubenswrapper[5008]: E0318 18:03:39.825318 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:39 crc kubenswrapper[5008]: E0318 18:03:39.825340 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:39 crc kubenswrapper[5008]: E0318 18:03:39.825380 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:39 crc kubenswrapper[5008]: E0318 18:03:39.825413 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:43.825389955 +0000 UTC m=+80.344863064 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:39 crc kubenswrapper[5008]: E0318 18:03:39.825421 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:39 crc kubenswrapper[5008]: E0318 18:03:39.825452 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:43.825435167 +0000 UTC m=+80.344908286 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:39 crc kubenswrapper[5008]: E0318 18:03:39.825519 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:43.825492378 +0000 UTC m=+80.344965487 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:39 crc kubenswrapper[5008]: E0318 18:03:39.825619 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:39 crc kubenswrapper[5008]: E0318 18:03:39.825664 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:39 crc kubenswrapper[5008]: E0318 18:03:39.825682 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:39 crc kubenswrapper[5008]: E0318 18:03:39.825750 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:43.825731774 +0000 UTC m=+80.345204883 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.865817 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.865875 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.865894 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.865917 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.865936 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:39Z","lastTransitionTime":"2026-03-18T18:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.969146 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.969226 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.969248 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.969353 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:39 crc kubenswrapper[5008]: I0318 18:03:39.969382 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:39Z","lastTransitionTime":"2026-03-18T18:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.071608 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.071939 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.072121 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.072303 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.072457 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:40Z","lastTransitionTime":"2026-03-18T18:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.175317 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.175390 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.175415 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.175444 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.175503 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:40Z","lastTransitionTime":"2026-03-18T18:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.198138 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.198177 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:40 crc kubenswrapper[5008]: E0318 18:03:40.198394 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.198449 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:40 crc kubenswrapper[5008]: E0318 18:03:40.198624 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:03:40 crc kubenswrapper[5008]: E0318 18:03:40.198783 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.279188 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.279246 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.279263 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.279285 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.279303 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:40Z","lastTransitionTime":"2026-03-18T18:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.381784 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.381836 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.381854 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.381877 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.381894 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:40Z","lastTransitionTime":"2026-03-18T18:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.485552 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.485651 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.485670 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.485692 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.485710 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:40Z","lastTransitionTime":"2026-03-18T18:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.589376 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.589444 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.589462 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.589486 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.589505 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:40Z","lastTransitionTime":"2026-03-18T18:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.692890 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.692951 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.692968 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.692993 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.693010 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:40Z","lastTransitionTime":"2026-03-18T18:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.795590 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.795648 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.795665 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.795740 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.795758 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:40Z","lastTransitionTime":"2026-03-18T18:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.898894 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.898925 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.898952 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.898965 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:40 crc kubenswrapper[5008]: I0318 18:03:40.898973 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:40Z","lastTransitionTime":"2026-03-18T18:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.002241 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.002312 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.002338 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.002368 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.002389 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:41Z","lastTransitionTime":"2026-03-18T18:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.105080 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.105122 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.105138 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.105161 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.105178 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:41Z","lastTransitionTime":"2026-03-18T18:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.207546 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.207651 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.207677 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.207705 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.207728 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:41Z","lastTransitionTime":"2026-03-18T18:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.310511 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.310607 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.310626 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.310650 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.310666 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:41Z","lastTransitionTime":"2026-03-18T18:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.413670 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.413724 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.413742 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.413768 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.413789 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:41Z","lastTransitionTime":"2026-03-18T18:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.516767 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.516822 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.516840 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.516864 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.516882 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:41Z","lastTransitionTime":"2026-03-18T18:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.620609 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.620669 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.620703 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.620734 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.620757 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:41Z","lastTransitionTime":"2026-03-18T18:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.724392 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.724447 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.724464 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.724486 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.724502 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:41Z","lastTransitionTime":"2026-03-18T18:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.827618 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.827682 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.827700 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.827725 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.827742 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:41Z","lastTransitionTime":"2026-03-18T18:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.931297 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.931364 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.931380 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.931407 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:41 crc kubenswrapper[5008]: I0318 18:03:41.931425 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:41Z","lastTransitionTime":"2026-03-18T18:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.034739 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.034787 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.034796 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.034814 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.034823 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:42Z","lastTransitionTime":"2026-03-18T18:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.138196 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.138252 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.138269 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.138295 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.138313 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:42Z","lastTransitionTime":"2026-03-18T18:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.198811 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.199044 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.199126 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:42 crc kubenswrapper[5008]: E0318 18:03:42.199276 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:03:42 crc kubenswrapper[5008]: E0318 18:03:42.199490 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:03:42 crc kubenswrapper[5008]: E0318 18:03:42.199683 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.219432 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.241890 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.241942 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.241965 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.242000 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.242025 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:42Z","lastTransitionTime":"2026-03-18T18:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.344958 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.345044 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.345067 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.345135 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.345161 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:42Z","lastTransitionTime":"2026-03-18T18:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.447919 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.447986 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.447997 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.448010 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.448024 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:42Z","lastTransitionTime":"2026-03-18T18:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.465147 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.477950 5008 scope.go:117] "RemoveContainer" containerID="1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960" Mar 18 18:03:42 crc kubenswrapper[5008]: E0318 18:03:42.478139 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.478845 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.550280 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.550645 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.550670 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.550694 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.550712 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:42Z","lastTransitionTime":"2026-03-18T18:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.654027 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.654097 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.654114 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.654138 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.654156 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:42Z","lastTransitionTime":"2026-03-18T18:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.757740 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.757870 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.757893 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.757918 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.757936 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:42Z","lastTransitionTime":"2026-03-18T18:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.758910 5008 scope.go:117] "RemoveContainer" containerID="1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960" Mar 18 18:03:42 crc kubenswrapper[5008]: E0318 18:03:42.759370 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.861263 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.861322 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.861340 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.861363 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.861381 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:42Z","lastTransitionTime":"2026-03-18T18:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.964370 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.964438 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.964455 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.964479 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:42 crc kubenswrapper[5008]: I0318 18:03:42.964497 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:42Z","lastTransitionTime":"2026-03-18T18:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.067998 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.068073 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.068094 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.068122 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.068141 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:43Z","lastTransitionTime":"2026-03-18T18:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.171462 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.171533 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.171600 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.171630 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.171650 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:43Z","lastTransitionTime":"2026-03-18T18:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.274893 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.274960 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.274977 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.275001 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.275019 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:43Z","lastTransitionTime":"2026-03-18T18:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.378390 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.378472 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.378484 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.378499 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.378509 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:43Z","lastTransitionTime":"2026-03-18T18:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.481489 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.481594 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.481616 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.481641 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.481658 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:43Z","lastTransitionTime":"2026-03-18T18:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.584411 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.584476 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.584502 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.584532 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.584600 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:43Z","lastTransitionTime":"2026-03-18T18:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.687675 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.687740 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.687759 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.687786 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.687807 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:43Z","lastTransitionTime":"2026-03-18T18:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.790402 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.790474 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.790490 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.790515 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.790533 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:43Z","lastTransitionTime":"2026-03-18T18:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.862664 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.862764 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.862798 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.862823 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.862851 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:43 crc kubenswrapper[5008]: E0318 18:03:43.862978 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:43 crc kubenswrapper[5008]: E0318 18:03:43.862995 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:03:51.862955172 +0000 UTC m=+88.382428291 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:03:43 crc kubenswrapper[5008]: E0318 18:03:43.863009 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:43 crc kubenswrapper[5008]: E0318 18:03:43.863033 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:43 crc kubenswrapper[5008]: E0318 18:03:43.863088 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:43 crc kubenswrapper[5008]: E0318 18:03:43.863114 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:43 crc kubenswrapper[5008]: E0318 18:03:43.863055 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:51.863031794 +0000 UTC m=+88.382504973 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:43 crc kubenswrapper[5008]: E0318 18:03:43.863237 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:43 crc kubenswrapper[5008]: E0318 18:03:43.863260 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:51.863190188 +0000 UTC m=+88.382663317 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:43 crc kubenswrapper[5008]: E0318 18:03:43.863293 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:43 crc kubenswrapper[5008]: E0318 18:03:43.863316 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:43 crc kubenswrapper[5008]: E0318 18:03:43.863322 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:51.863298521 +0000 UTC m=+88.382771760 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:43 crc kubenswrapper[5008]: E0318 18:03:43.863409 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:03:51.863379783 +0000 UTC m=+88.382852902 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.894228 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.894273 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.894288 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.894308 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.894322 5008 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:43Z","lastTransitionTime":"2026-03-18T18:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.997876 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.997922 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.997934 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.997951 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:43 crc kubenswrapper[5008]: I0318 18:03:43.997963 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:43Z","lastTransitionTime":"2026-03-18T18:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.043684 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.043832 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.043852 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.043881 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.043898 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: E0318 18:03:44.060707 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.066204 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.066265 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.066282 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.066307 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.066325 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: E0318 18:03:44.080794 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.085779 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.085876 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.085894 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.085919 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.085936 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.107055 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.107128 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.107148 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.107169 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.107182 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: E0318 18:03:44.123154 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.132194 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.132347 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.132373 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.132409 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.132447 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: E0318 18:03:44.152515 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:44 crc kubenswrapper[5008]: E0318 18:03:44.152835 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.155476 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.155538 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.155579 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.155607 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.155625 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.198257 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.198451 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.198635 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:44 crc kubenswrapper[5008]: E0318 18:03:44.198513 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:03:44 crc kubenswrapper[5008]: E0318 18:03:44.198791 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:03:44 crc kubenswrapper[5008]: E0318 18:03:44.198993 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.227479 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.246206 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.258906 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.258989 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.259010 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.259038 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.259060 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.262798 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.277011 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.293907 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.310043 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.324538 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.340768 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.363081 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.363160 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.363181 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.363212 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.363242 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.466762 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.466833 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.466850 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.466875 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.466896 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.569906 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.569966 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.569983 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.570006 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.570025 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.674152 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.674239 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.674267 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.674347 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.674375 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.777263 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.777318 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.777336 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.777359 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.777425 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.881069 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.881134 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.881151 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.881175 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.881194 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.984984 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.985052 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.985069 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.985093 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:44 crc kubenswrapper[5008]: I0318 18:03:44.985114 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:44Z","lastTransitionTime":"2026-03-18T18:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.088169 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.088234 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.088252 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.088280 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.088297 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:45Z","lastTransitionTime":"2026-03-18T18:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.191640 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.191739 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.191775 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.191808 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.191829 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:45Z","lastTransitionTime":"2026-03-18T18:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.296003 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.296083 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.296102 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.296126 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.296144 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:45Z","lastTransitionTime":"2026-03-18T18:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.399005 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.399080 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.399103 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.399131 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.399151 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:45Z","lastTransitionTime":"2026-03-18T18:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.502504 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.502611 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.502636 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.502669 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.502691 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:45Z","lastTransitionTime":"2026-03-18T18:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.606917 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.607017 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.607036 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.607059 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.607083 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:45Z","lastTransitionTime":"2026-03-18T18:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.710640 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.710719 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.710738 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.710765 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.710788 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:45Z","lastTransitionTime":"2026-03-18T18:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.814548 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.814659 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.814678 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.814707 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.814724 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:45Z","lastTransitionTime":"2026-03-18T18:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.918116 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.918197 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.918216 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.918250 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:45 crc kubenswrapper[5008]: I0318 18:03:45.918278 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:45Z","lastTransitionTime":"2026-03-18T18:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.022359 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.022445 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.022476 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.022507 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.022531 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:46Z","lastTransitionTime":"2026-03-18T18:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.131102 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.131161 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.131179 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.131209 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.131234 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:46Z","lastTransitionTime":"2026-03-18T18:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.198386 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.198421 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.198790 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:46 crc kubenswrapper[5008]: E0318 18:03:46.199169 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:03:46 crc kubenswrapper[5008]: E0318 18:03:46.199477 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:03:46 crc kubenswrapper[5008]: E0318 18:03:46.200097 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.234045 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.234125 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.234144 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.234174 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.234194 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:46Z","lastTransitionTime":"2026-03-18T18:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.337884 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.337952 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.337970 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.337997 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.338014 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:46Z","lastTransitionTime":"2026-03-18T18:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.441481 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.441611 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.441631 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.441661 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.441685 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:46Z","lastTransitionTime":"2026-03-18T18:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.545450 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.545528 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.545546 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.545614 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.545638 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:46Z","lastTransitionTime":"2026-03-18T18:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.649605 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.649686 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.649706 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.649735 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.649757 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:46Z","lastTransitionTime":"2026-03-18T18:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.752734 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.752877 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.752902 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.752928 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.752992 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:46Z","lastTransitionTime":"2026-03-18T18:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.856634 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.856720 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.856743 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.856779 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.856804 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:46Z","lastTransitionTime":"2026-03-18T18:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.961326 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.961395 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.961420 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.961448 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:46 crc kubenswrapper[5008]: I0318 18:03:46.961469 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:46Z","lastTransitionTime":"2026-03-18T18:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.063788 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.063846 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.063862 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.063885 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.063902 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:47Z","lastTransitionTime":"2026-03-18T18:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.167391 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.167482 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.167502 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.167590 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.167611 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:47Z","lastTransitionTime":"2026-03-18T18:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.270918 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.270986 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.271004 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.271027 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.271043 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:47Z","lastTransitionTime":"2026-03-18T18:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.373647 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.373687 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.373698 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.373714 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.373726 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:47Z","lastTransitionTime":"2026-03-18T18:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.476770 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.476819 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.476836 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.476858 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.476875 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:47Z","lastTransitionTime":"2026-03-18T18:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.580347 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.580414 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.580437 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.580467 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.580488 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:47Z","lastTransitionTime":"2026-03-18T18:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.684441 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.684503 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.684522 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.684544 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.684595 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:47Z","lastTransitionTime":"2026-03-18T18:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.788187 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.788283 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.788314 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.788334 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.788347 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:47Z","lastTransitionTime":"2026-03-18T18:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.891362 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.891427 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.891444 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.891468 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.891486 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:47Z","lastTransitionTime":"2026-03-18T18:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.993312 5008 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.994763 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.994837 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.994861 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.994893 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:47 crc kubenswrapper[5008]: I0318 18:03:47.994919 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:47Z","lastTransitionTime":"2026-03-18T18:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.099467 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.099523 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.099536 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.099583 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.099598 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:48Z","lastTransitionTime":"2026-03-18T18:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.198071 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.198164 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.198108 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:48 crc kubenswrapper[5008]: E0318 18:03:48.198289 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:03:48 crc kubenswrapper[5008]: E0318 18:03:48.199180 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:03:48 crc kubenswrapper[5008]: E0318 18:03:48.199035 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.203660 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.203704 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.203720 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.203744 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.203762 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:48Z","lastTransitionTime":"2026-03-18T18:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.307246 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.307309 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.307323 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.307345 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.307360 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:48Z","lastTransitionTime":"2026-03-18T18:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.410900 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.410971 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.410995 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.411030 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.411054 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:48Z","lastTransitionTime":"2026-03-18T18:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.514279 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.514328 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.514348 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.514370 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.514386 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:48Z","lastTransitionTime":"2026-03-18T18:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.617077 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.617155 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.617175 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.617198 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.617214 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:48Z","lastTransitionTime":"2026-03-18T18:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.721483 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.721611 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.721631 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.721657 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.721677 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:48Z","lastTransitionTime":"2026-03-18T18:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.825149 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.825213 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.825237 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.825266 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.825290 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:48Z","lastTransitionTime":"2026-03-18T18:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.928060 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.928108 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.928121 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.928143 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:48 crc kubenswrapper[5008]: I0318 18:03:48.928154 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:48Z","lastTransitionTime":"2026-03-18T18:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.031458 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.031534 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.031583 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.031609 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.031632 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:49Z","lastTransitionTime":"2026-03-18T18:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.135241 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.135290 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.135306 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.135331 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.135347 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:49Z","lastTransitionTime":"2026-03-18T18:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.241015 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.241094 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.241114 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.241519 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.241771 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:49Z","lastTransitionTime":"2026-03-18T18:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.345490 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.345963 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.345982 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.346029 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.346047 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:49Z","lastTransitionTime":"2026-03-18T18:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.448058 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.448082 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.448090 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.448104 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.448113 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:49Z","lastTransitionTime":"2026-03-18T18:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.550165 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.550211 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.550225 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.550244 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.550257 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:49Z","lastTransitionTime":"2026-03-18T18:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.652929 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.652985 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.652998 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.653018 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.653032 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:49Z","lastTransitionTime":"2026-03-18T18:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.755812 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.755850 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.755858 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.755871 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.755880 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:49Z","lastTransitionTime":"2026-03-18T18:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.780771 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.780865 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.782504 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.813365 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.825309 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.838157 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.846797 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.857537 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.857981 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.858091 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.858174 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.858244 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.858301 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:49Z","lastTransitionTime":"2026-03-18T18:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.868729 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.877407 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.889917 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.903592 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:49Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.918038 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:49Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.935984 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:49Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.953911 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:49Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.962209 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.962405 5008 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.962495 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.962587 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.962670 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:49Z","lastTransitionTime":"2026-03-18T18:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.969821 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:49Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:49 crc kubenswrapper[5008]: I0318 18:03:49.986263 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:49Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.018536 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:50Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.034455 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:50Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.065449 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.065617 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.065678 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.065748 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.065823 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:50Z","lastTransitionTime":"2026-03-18T18:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.168708 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.168841 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.168911 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.168986 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.169057 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:50Z","lastTransitionTime":"2026-03-18T18:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.198163 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:50 crc kubenswrapper[5008]: E0318 18:03:50.198427 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.198188 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:50 crc kubenswrapper[5008]: E0318 18:03:50.198706 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.198181 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:50 crc kubenswrapper[5008]: E0318 18:03:50.198943 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.272282 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.272355 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.272379 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.272412 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.272435 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:50Z","lastTransitionTime":"2026-03-18T18:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.375474 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.375539 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.375588 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.375659 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.375683 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:50Z","lastTransitionTime":"2026-03-18T18:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.478988 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.479551 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.479755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.479926 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.480120 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:50Z","lastTransitionTime":"2026-03-18T18:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.589064 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.589127 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.589160 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.589187 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.589208 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:50Z","lastTransitionTime":"2026-03-18T18:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.691730 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.692048 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.692141 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.692239 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.692323 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:50Z","lastTransitionTime":"2026-03-18T18:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.794420 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.794476 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.794487 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.794505 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.794518 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:50Z","lastTransitionTime":"2026-03-18T18:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.897643 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.897682 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.897694 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.897713 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.897724 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:50Z","lastTransitionTime":"2026-03-18T18:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.999688 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.999718 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.999726 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.999739 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:50 crc kubenswrapper[5008]: I0318 18:03:50.999748 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:50Z","lastTransitionTime":"2026-03-18T18:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.103729 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.103772 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.103790 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.103812 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.103827 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:51Z","lastTransitionTime":"2026-03-18T18:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.206448 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.206490 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.206500 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.206517 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.206528 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:51Z","lastTransitionTime":"2026-03-18T18:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.310024 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.310096 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.310116 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.310142 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.310157 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:51Z","lastTransitionTime":"2026-03-18T18:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.412988 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.413033 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.413042 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.413054 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.413064 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:51Z","lastTransitionTime":"2026-03-18T18:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.515550 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.515629 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.515643 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.515667 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.515688 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:51Z","lastTransitionTime":"2026-03-18T18:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.619354 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.619397 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.619409 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.619431 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.619445 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:51Z","lastTransitionTime":"2026-03-18T18:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.722528 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.722649 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.722716 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.722752 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.722815 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:51Z","lastTransitionTime":"2026-03-18T18:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.826005 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.826056 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.826074 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.826098 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.826116 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:51Z","lastTransitionTime":"2026-03-18T18:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.929031 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.929095 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.929115 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.929141 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.929161 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:51Z","lastTransitionTime":"2026-03-18T18:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.941480 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.941612 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.941661 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.941708 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:51 crc kubenswrapper[5008]: I0318 18:03:51.941757 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:51 crc kubenswrapper[5008]: E0318 18:03:51.941927 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:51 crc kubenswrapper[5008]: E0318 18:03:51.941965 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:51 crc kubenswrapper[5008]: E0318 18:03:51.941987 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:51 crc kubenswrapper[5008]: E0318 18:03:51.942059 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:07.942034919 +0000 UTC m=+104.461508038 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:51 crc kubenswrapper[5008]: E0318 18:03:51.942594 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:07.942539902 +0000 UTC m=+104.462013021 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:03:51 crc kubenswrapper[5008]: E0318 18:03:51.942673 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:51 crc kubenswrapper[5008]: E0318 18:03:51.942726 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:07.942711867 +0000 UTC m=+104.462184986 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:03:51 crc kubenswrapper[5008]: E0318 18:03:51.942819 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:51 crc kubenswrapper[5008]: E0318 18:03:51.942874 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:07.942860421 +0000 UTC m=+104.462333540 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:03:51 crc kubenswrapper[5008]: E0318 18:03:51.942966 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:03:51 crc kubenswrapper[5008]: E0318 18:03:51.942998 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:03:51 crc kubenswrapper[5008]: E0318 18:03:51.943018 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:51 crc kubenswrapper[5008]: E0318 18:03:51.943062 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:07.943047855 +0000 UTC m=+104.462520974 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.042273 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.042306 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.042314 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.042328 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.042337 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:52Z","lastTransitionTime":"2026-03-18T18:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.188052 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.188347 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.188421 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.188528 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.188643 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:52Z","lastTransitionTime":"2026-03-18T18:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.197545 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:52 crc kubenswrapper[5008]: E0318 18:03:52.197736 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.198106 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:52 crc kubenswrapper[5008]: E0318 18:03:52.198230 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.198343 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:52 crc kubenswrapper[5008]: E0318 18:03:52.198453 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.291064 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.291112 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.291121 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.291136 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.291146 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:52Z","lastTransitionTime":"2026-03-18T18:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.393321 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.393366 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.393375 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.393391 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.393402 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:52Z","lastTransitionTime":"2026-03-18T18:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.496359 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.496784 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.496868 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.496956 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.497029 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:52Z","lastTransitionTime":"2026-03-18T18:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.599435 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.599819 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.599998 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.600199 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.600330 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:52Z","lastTransitionTime":"2026-03-18T18:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.702667 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.702906 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.703061 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.703207 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.703355 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:52Z","lastTransitionTime":"2026-03-18T18:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.791932 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8"} Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.806038 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.806116 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.806144 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.806177 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.806198 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:52Z","lastTransitionTime":"2026-03-18T18:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.823858 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:52Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.844732 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:52Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.862630 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:52Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.880347 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:52Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.898152 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:03:52Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.909140 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.909370 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.909441 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.909511 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.909641 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:52Z","lastTransitionTime":"2026-03-18T18:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.920115 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:52Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.948660 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:52Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:52 crc kubenswrapper[5008]: I0318 18:03:52.962464 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:52Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.013538 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.013621 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.013640 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.013668 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.013688 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:53Z","lastTransitionTime":"2026-03-18T18:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.116983 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.117034 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.117043 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.117059 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.117068 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:53Z","lastTransitionTime":"2026-03-18T18:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.211605 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.220459 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.220687 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.220703 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.220718 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.220733 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:53Z","lastTransitionTime":"2026-03-18T18:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.329448 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.329691 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.329740 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.329775 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.329801 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:53Z","lastTransitionTime":"2026-03-18T18:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.433190 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.434282 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.434431 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.434601 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.434744 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:53Z","lastTransitionTime":"2026-03-18T18:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.537576 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.537609 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.537618 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.537635 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.537644 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:53Z","lastTransitionTime":"2026-03-18T18:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.640074 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.640121 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.640133 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.640149 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.640163 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:53Z","lastTransitionTime":"2026-03-18T18:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.743540 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.743613 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.743624 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.743641 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.743654 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:53Z","lastTransitionTime":"2026-03-18T18:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.758817 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8nxl6"] Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.759311 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-8nxl6" Mar 18 18:03:53 crc kubenswrapper[5008]: W0318 18:03:53.760837 5008 reflector.go:561] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": failed to list *v1.Secret: secrets "node-resolver-dockercfg-kz9s7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Mar 18 18:03:53 crc kubenswrapper[5008]: E0318 18:03:53.760896 5008 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"node-resolver-dockercfg-kz9s7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-resolver-dockercfg-kz9s7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.761375 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.762226 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.771604 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:53Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.790846 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:53Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.806063 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:53Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.820714 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:53Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.831686 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:53Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.845411 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:53Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.846650 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.846691 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.846702 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.846717 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.846731 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:53Z","lastTransitionTime":"2026-03-18T18:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.857845 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b2f0793b-3ae6-43d8-938e-f885d593d0a2-hosts-file\") pod \"node-resolver-8nxl6\" (UID: \"b2f0793b-3ae6-43d8-938e-f885d593d0a2\") " pod="openshift-dns/node-resolver-8nxl6" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.857933 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zkcd\" (UniqueName: \"kubernetes.io/projected/b2f0793b-3ae6-43d8-938e-f885d593d0a2-kube-api-access-8zkcd\") pod \"node-resolver-8nxl6\" (UID: \"b2f0793b-3ae6-43d8-938e-f885d593d0a2\") " pod="openshift-dns/node-resolver-8nxl6" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.862711 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:03:53Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.876186 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:53Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.890305 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:53Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.904169 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:53Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.949238 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.949292 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.949302 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.949317 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.949327 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:53Z","lastTransitionTime":"2026-03-18T18:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.959084 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zkcd\" (UniqueName: \"kubernetes.io/projected/b2f0793b-3ae6-43d8-938e-f885d593d0a2-kube-api-access-8zkcd\") pod \"node-resolver-8nxl6\" (UID: \"b2f0793b-3ae6-43d8-938e-f885d593d0a2\") " pod="openshift-dns/node-resolver-8nxl6" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.959125 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b2f0793b-3ae6-43d8-938e-f885d593d0a2-hosts-file\") pod \"node-resolver-8nxl6\" (UID: \"b2f0793b-3ae6-43d8-938e-f885d593d0a2\") " pod="openshift-dns/node-resolver-8nxl6" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.959206 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b2f0793b-3ae6-43d8-938e-f885d593d0a2-hosts-file\") pod \"node-resolver-8nxl6\" (UID: \"b2f0793b-3ae6-43d8-938e-f885d593d0a2\") " pod="openshift-dns/node-resolver-8nxl6" Mar 18 18:03:53 crc kubenswrapper[5008]: I0318 18:03:53.992222 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zkcd\" (UniqueName: \"kubernetes.io/projected/b2f0793b-3ae6-43d8-938e-f885d593d0a2-kube-api-access-8zkcd\") pod \"node-resolver-8nxl6\" (UID: \"b2f0793b-3ae6-43d8-938e-f885d593d0a2\") " pod="openshift-dns/node-resolver-8nxl6" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.051547 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.051639 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc 
kubenswrapper[5008]: I0318 18:03:54.051658 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.051681 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.051697 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.127241 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-crzrt"] Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.127747 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-l6h7t"] Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.128265 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.128831 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.132019 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.132951 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.133270 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.133472 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.133846 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.134120 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.134337 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.134476 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.134678 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.134812 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.138162 
5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-sgv8s"] Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.138634 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.140491 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.141110 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.153828 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.153901 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.153925 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.153955 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.153976 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.156197 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.164917 5008 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.173500 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.188691 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.198176 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.198214 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:54 crc kubenswrapper[5008]: E0318 18:03:54.198277 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.198302 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:54 crc kubenswrapper[5008]: E0318 18:03:54.198340 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:03:54 crc kubenswrapper[5008]: E0318 18:03:54.198619 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.201145 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.213206 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.226736 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.251782 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd545
2bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.256626 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.256677 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.256700 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.256720 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.256734 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.261168 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/322f1eea-395d-476c-a43b-c68071d0af20-system-cni-dir\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.261297 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-system-cni-dir\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.261342 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-multus-daemon-config\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.261374 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de73a23f-7b17-40f3-bb5d-14c8bff178b9-proxy-tls\") pod \"machine-config-daemon-crzrt\" (UID: \"de73a23f-7b17-40f3-bb5d-14c8bff178b9\") " pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.261394 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-cni-binary-copy\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.261417 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-run-netns\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.261437 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-var-lib-cni-multus\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.261490 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvmwq\" (UniqueName: \"kubernetes.io/projected/322f1eea-395d-476c-a43b-c68071d0af20-kube-api-access-jvmwq\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.261537 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-multus-socket-dir-parent\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.261630 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-etc-kubernetes\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.261799 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/322f1eea-395d-476c-a43b-c68071d0af20-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.261899 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/322f1eea-395d-476c-a43b-c68071d0af20-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.261986 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-var-lib-kubelet\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262070 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46tr4\" (UniqueName: \"kubernetes.io/projected/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-kube-api-access-46tr4\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262126 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-cnibin\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262167 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-hostroot\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262228 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/322f1eea-395d-476c-a43b-c68071d0af20-cnibin\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262264 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/322f1eea-395d-476c-a43b-c68071d0af20-cni-binary-copy\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262336 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-multus-conf-dir\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262419 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-run-multus-certs\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262539 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/322f1eea-395d-476c-a43b-c68071d0af20-os-release\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262614 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de73a23f-7b17-40f3-bb5d-14c8bff178b9-mcd-auth-proxy-config\") pod \"machine-config-daemon-crzrt\" (UID: \"de73a23f-7b17-40f3-bb5d-14c8bff178b9\") " pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262646 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-os-release\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262691 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5mmg\" (UniqueName: \"kubernetes.io/projected/de73a23f-7b17-40f3-bb5d-14c8bff178b9-kube-api-access-c5mmg\") pod \"machine-config-daemon-crzrt\" (UID: \"de73a23f-7b17-40f3-bb5d-14c8bff178b9\") " pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 
18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262738 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-var-lib-cni-bin\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262775 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-multus-cni-dir\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262810 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-run-k8s-cni-cncf-io\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.262850 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/de73a23f-7b17-40f3-bb5d-14c8bff178b9-rootfs\") pod \"machine-config-daemon-crzrt\" (UID: \"de73a23f-7b17-40f3-bb5d-14c8bff178b9\") " pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.267613 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\"
,\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.290296 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.303362 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.316448 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.332666 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.347607 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.359391 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.359428 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.359436 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.359451 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.359462 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.362403 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.363848 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-multus-cni-dir\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.363891 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-run-k8s-cni-cncf-io\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.363911 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/de73a23f-7b17-40f3-bb5d-14c8bff178b9-rootfs\") pod \"machine-config-daemon-crzrt\" (UID: \"de73a23f-7b17-40f3-bb5d-14c8bff178b9\") " pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.363949 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/322f1eea-395d-476c-a43b-c68071d0af20-system-cni-dir\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.363965 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-system-cni-dir\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.363965 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-run-k8s-cni-cncf-io\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.363982 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-multus-daemon-config\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364021 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de73a23f-7b17-40f3-bb5d-14c8bff178b9-proxy-tls\") pod \"machine-config-daemon-crzrt\" (UID: \"de73a23f-7b17-40f3-bb5d-14c8bff178b9\") " pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364031 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-multus-cni-dir\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364064 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/de73a23f-7b17-40f3-bb5d-14c8bff178b9-rootfs\") pod \"machine-config-daemon-crzrt\" (UID: \"de73a23f-7b17-40f3-bb5d-14c8bff178b9\") " pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364046 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-cni-binary-copy\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364145 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-system-cni-dir\") pod \"multus-sgv8s\" (UID: 
\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364139 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/322f1eea-395d-476c-a43b-c68071d0af20-system-cni-dir\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364185 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-run-netns\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364235 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-run-netns\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364262 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-var-lib-cni-multus\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364308 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvmwq\" (UniqueName: \"kubernetes.io/projected/322f1eea-395d-476c-a43b-c68071d0af20-kube-api-access-jvmwq\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " 
pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364345 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-multus-socket-dir-parent\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364375 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-var-lib-cni-multus\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364380 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-etc-kubernetes\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364425 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-etc-kubernetes\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364450 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/322f1eea-395d-476c-a43b-c68071d0af20-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 
18:03:54.364476 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/322f1eea-395d-476c-a43b-c68071d0af20-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364503 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-var-lib-kubelet\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364525 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46tr4\" (UniqueName: \"kubernetes.io/projected/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-kube-api-access-46tr4\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364548 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-cnibin\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364588 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-hostroot\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364630 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/322f1eea-395d-476c-a43b-c68071d0af20-cnibin\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364644 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-multus-daemon-config\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364653 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/322f1eea-395d-476c-a43b-c68071d0af20-cni-binary-copy\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364675 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-multus-conf-dir\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364697 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-run-multus-certs\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364721 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/322f1eea-395d-476c-a43b-c68071d0af20-os-release\") 
pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364748 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de73a23f-7b17-40f3-bb5d-14c8bff178b9-mcd-auth-proxy-config\") pod \"machine-config-daemon-crzrt\" (UID: \"de73a23f-7b17-40f3-bb5d-14c8bff178b9\") " pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364769 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-os-release\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364792 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5mmg\" (UniqueName: \"kubernetes.io/projected/de73a23f-7b17-40f3-bb5d-14c8bff178b9-kube-api-access-c5mmg\") pod \"machine-config-daemon-crzrt\" (UID: \"de73a23f-7b17-40f3-bb5d-14c8bff178b9\") " pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364807 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-cni-binary-copy\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364812 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-var-lib-cni-bin\") pod 
\"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364840 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-var-lib-cni-bin\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364910 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/322f1eea-395d-476c-a43b-c68071d0af20-os-release\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364916 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-multus-socket-dir-parent\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364941 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-run-multus-certs\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.364973 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-hostroot\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: 
I0318 18:03:54.364971 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-multus-conf-dir\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.365007 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-os-release\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.365019 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-cnibin\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.365087 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/322f1eea-395d-476c-a43b-c68071d0af20-cnibin\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.365306 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-host-var-lib-kubelet\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.365610 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/de73a23f-7b17-40f3-bb5d-14c8bff178b9-mcd-auth-proxy-config\") pod \"machine-config-daemon-crzrt\" (UID: \"de73a23f-7b17-40f3-bb5d-14c8bff178b9\") " pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.365747 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/322f1eea-395d-476c-a43b-c68071d0af20-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.365888 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/322f1eea-395d-476c-a43b-c68071d0af20-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.366164 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/322f1eea-395d-476c-a43b-c68071d0af20-cni-binary-copy\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.377513 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de73a23f-7b17-40f3-bb5d-14c8bff178b9-proxy-tls\") pod \"machine-config-daemon-crzrt\" (UID: \"de73a23f-7b17-40f3-bb5d-14c8bff178b9\") " pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.380839 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-46tr4\" (UniqueName: \"kubernetes.io/projected/9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd-kube-api-access-46tr4\") pod \"multus-sgv8s\" (UID: \"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\") " pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.381293 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.387090 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5mmg\" (UniqueName: \"kubernetes.io/projected/de73a23f-7b17-40f3-bb5d-14c8bff178b9-kube-api-access-c5mmg\") pod \"machine-config-daemon-crzrt\" (UID: \"de73a23f-7b17-40f3-bb5d-14c8bff178b9\") " pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.388135 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvmwq\" (UniqueName: \"kubernetes.io/projected/322f1eea-395d-476c-a43b-c68071d0af20-kube-api-access-jvmwq\") pod \"multus-additional-cni-plugins-l6h7t\" (UID: \"322f1eea-395d-476c-a43b-c68071d0af20\") " pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.396424 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.414216 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.429740 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.443151 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.456964 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.459234 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.462019 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.462071 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.462088 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.462113 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.462131 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.467516 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.473969 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-sgv8s" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.474890 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: W0318 18:03:54.486878 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde73a23f_7b17_40f3_bb5d_14c8bff178b9.slice/crio-71232e2b29544c7a077d49a630c0a5cfcc8bb13c847d53318e03fd88ac61c419 WatchSource:0}: Error finding container 71232e2b29544c7a077d49a630c0a5cfcc8bb13c847d53318e03fd88ac61c419: Status 404 returned error can't find the container with id 71232e2b29544c7a077d49a630c0a5cfcc8bb13c847d53318e03fd88ac61c419 Mar 18 18:03:54 crc kubenswrapper[5008]: W0318 18:03:54.487324 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod322f1eea_395d_476c_a43b_c68071d0af20.slice/crio-d08fa64abe98e650ab554821fa6dd668949217e3775d21b21f52b0449a1b3e94 WatchSource:0}: Error finding container 
d08fa64abe98e650ab554821fa6dd668949217e3775d21b21f52b0449a1b3e94: Status 404 returned error can't find the container with id d08fa64abe98e650ab554821fa6dd668949217e3775d21b21f52b0449a1b3e94 Mar 18 18:03:54 crc kubenswrapper[5008]: W0318 18:03:54.490933 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8d2b81_71c9_44b4_86ad_8a3ec4c0c2dd.slice/crio-f21c462fe8745439b86249fbccf21e88ac952448bdfb69d7e60ebc1d10e6fe41 WatchSource:0}: Error finding container f21c462fe8745439b86249fbccf21e88ac952448bdfb69d7e60ebc1d10e6fe41: Status 404 returned error can't find the container with id f21c462fe8745439b86249fbccf21e88ac952448bdfb69d7e60ebc1d10e6fe41 Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.498748 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043
d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.501726 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5278w"] Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.503893 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.508478 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.508524 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.508478 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.508961 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.509174 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.509366 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.509465 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.518891 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\"
,\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.521092 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.521132 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.521143 5008 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.521206 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.521219 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.538981 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d664
38c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: E0318 18:03:54.542265 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.547958 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.548009 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.548022 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.548045 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.548058 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.556376 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: E0318 18:03:54.563135 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.569520 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.575985 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.576133 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.576146 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.576175 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.576190 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.581441 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: E0318 18:03:54.597894 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.598686 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.603280 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.603311 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.603321 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.603341 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.603353 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.614955 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: E0318 18:03:54.619375 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.624083 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.624114 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.624129 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.624149 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.624163 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.628580 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: E0318 18:03:54.639337 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: E0318 18:03:54.639458 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.645087 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.645110 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.645119 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.645154 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.645801 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.657051 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668344 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-systemd\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668397 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-log-socket\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668421 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-run-ovn-kubernetes\") pod \"ovnkube-node-5278w\" (UID: 
\"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668449 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-node-log\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668467 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-systemd-units\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668485 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-openvswitch\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668518 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-kubelet\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668537 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-run-netns\") pod \"ovnkube-node-5278w\" (UID: 
\"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668579 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-etc-openvswitch\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668597 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-ovn\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668622 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-env-overrides\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668644 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-slash\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668663 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29hqn\" (UniqueName: \"kubernetes.io/projected/b105c010-f5cb-41ae-bdff-62bc05da91a1-kube-api-access-29hqn\") pod 
\"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668686 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-var-lib-openvswitch\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668704 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-cni-bin\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668724 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovn-node-metrics-cert\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668749 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovnkube-script-lib\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668766 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovnkube-config\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668797 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-cni-netd\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.668816 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.675895 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.691318 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.704190 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.729000 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.745213 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.747861 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.747893 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.747904 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.747920 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.747940 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.761821 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.769799 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-env-overrides\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.769850 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-slash\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.769869 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29hqn\" (UniqueName: \"kubernetes.io/projected/b105c010-f5cb-41ae-bdff-62bc05da91a1-kube-api-access-29hqn\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.769889 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-var-lib-openvswitch\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.769906 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-cni-bin\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.769926 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovn-node-metrics-cert\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.769950 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovnkube-script-lib\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.769966 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovnkube-config\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.769981 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-cni-netd\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.769996 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770025 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-systemd\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770041 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-log-socket\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770055 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-run-ovn-kubernetes\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770071 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-node-log\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770087 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-systemd-units\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770103 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-openvswitch\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770130 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-kubelet\") 
pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770143 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-run-netns\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770168 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-etc-openvswitch\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770184 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-ovn\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770244 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-ovn\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770279 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-slash\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" 
Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770590 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-log-socket\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770615 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-run-ovn-kubernetes\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770636 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-node-log\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770640 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-systemd\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770661 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-systemd-units\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770686 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-openvswitch\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770709 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-kubelet\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770709 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-var-lib-openvswitch\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770732 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-cni-bin\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770752 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-run-netns\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770777 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-etc-openvswitch\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.770914 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-env-overrides\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.771060 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-cni-netd\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.771073 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.771453 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovnkube-config\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.771895 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovnkube-script-lib\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.774984 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovn-node-metrics-cert\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.784263 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.792333 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29hqn\" (UniqueName: \"kubernetes.io/projected/b105c010-f5cb-41ae-bdff-62bc05da91a1-kube-api-access-29hqn\") pod \"ovnkube-node-5278w\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.799475 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.799695 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7"} 
Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.799779 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"71232e2b29544c7a077d49a630c0a5cfcc8bb13c847d53318e03fd88ac61c419"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.801579 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgv8s" event={"ID":"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd","Type":"ContainerStarted","Data":"4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.801638 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgv8s" event={"ID":"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd","Type":"ContainerStarted","Data":"f21c462fe8745439b86249fbccf21e88ac952448bdfb69d7e60ebc1d10e6fe41"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.802352 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.802952 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" event={"ID":"322f1eea-395d-476c-a43b-c68071d0af20","Type":"ContainerStarted","Data":"880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.802999 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" 
event={"ID":"322f1eea-395d-476c-a43b-c68071d0af20","Type":"ContainerStarted","Data":"d08fa64abe98e650ab554821fa6dd668949217e3775d21b21f52b0449a1b3e94"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.821866 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:03:54 crc kubenswrapper[5008]: W0318 18:03:54.835008 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb105c010_f5cb_41ae_bdff_62bc05da91a1.slice/crio-0444e493c681e3b62a7e6d7372ec2335c13d32de42341be333ec7f12ccc56662 WatchSource:0}: Error finding container 0444e493c681e3b62a7e6d7372ec2335c13d32de42341be333ec7f12ccc56662: Status 404 returned error can't find the container with id 0444e493c681e3b62a7e6d7372ec2335c13d32de42341be333ec7f12ccc56662 Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.841025 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.850239 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.850284 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.850325 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.850345 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.850356 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.859862 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.890039 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.908328 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.943453 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.953517 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.953584 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.953608 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.953624 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.953633 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:54Z","lastTransitionTime":"2026-03-18T18:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.962105 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.972135 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.984931 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:54 crc kubenswrapper[5008]: I0318 18:03:54.996138 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.009272 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.024175 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.043083 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.057375 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.057421 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.057432 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.057450 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.057459 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:55Z","lastTransitionTime":"2026-03-18T18:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.061926 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.080868 5008 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-dns/node-resolver-8nxl6" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.080959 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-8nxl6" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.081425 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.098896 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: W0318 18:03:55.100841 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2f0793b_3ae6_43d8_938e_f885d593d0a2.slice/crio-2c3ae2a4f89494f7fea6ef7ba8d163a5ffae7b362200a491dc31ec694225cb66 WatchSource:0}: Error finding container 2c3ae2a4f89494f7fea6ef7ba8d163a5ffae7b362200a491dc31ec694225cb66: Status 404 returned error can't find the container with id 2c3ae2a4f89494f7fea6ef7ba8d163a5ffae7b362200a491dc31ec694225cb66 Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.121539 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.143964 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.156465 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.166049 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.166103 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.166117 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.166136 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.166150 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:55Z","lastTransitionTime":"2026-03-18T18:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.168780 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.183960 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.197298 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.208868 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.223610 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.242173 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.269198 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.269234 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.269243 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.269259 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.269270 5008 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:55Z","lastTransitionTime":"2026-03-18T18:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.273325 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.315650 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.352677 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.372157 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.372203 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.372216 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.372238 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.372250 5008 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:55Z","lastTransitionTime":"2026-03-18T18:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.409403 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799
f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.475854 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.475901 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.475913 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.475932 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.475948 5008 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:55Z","lastTransitionTime":"2026-03-18T18:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.579138 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.579208 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.579225 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.579252 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.579271 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:55Z","lastTransitionTime":"2026-03-18T18:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.682843 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.682890 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.682902 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.682920 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.682932 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:55Z","lastTransitionTime":"2026-03-18T18:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.785986 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.786413 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.786425 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.786447 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.786460 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:55Z","lastTransitionTime":"2026-03-18T18:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.808404 5008 generic.go:334] "Generic (PLEG): container finished" podID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerID="69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a" exitCode=0 Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.808502 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.808667 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerStarted","Data":"0444e493c681e3b62a7e6d7372ec2335c13d32de42341be333ec7f12ccc56662"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.810424 5008 generic.go:334] "Generic (PLEG): container finished" podID="322f1eea-395d-476c-a43b-c68071d0af20" containerID="880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39" exitCode=0 Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.810472 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" event={"ID":"322f1eea-395d-476c-a43b-c68071d0af20","Type":"ContainerDied","Data":"880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.815725 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8nxl6" event={"ID":"b2f0793b-3ae6-43d8-938e-f885d593d0a2","Type":"ContainerStarted","Data":"d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.815820 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8nxl6" 
event={"ID":"b2f0793b-3ae6-43d8-938e-f885d593d0a2","Type":"ContainerStarted","Data":"2c3ae2a4f89494f7fea6ef7ba8d163a5ffae7b362200a491dc31ec694225cb66"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.846341 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.864195 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.879668 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.888955 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.888988 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.888999 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.889018 5008 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.889031 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:55Z","lastTransitionTime":"2026-03-18T18:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.897459 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.911233 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.924630 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.940639 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.957304 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.973043 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.990592 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:55Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.993888 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.993921 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.993935 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.993954 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:55 crc kubenswrapper[5008]: I0318 18:03:55.993969 5008 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:55Z","lastTransitionTime":"2026-03-18T18:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.014155 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799
f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.031841 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.044623 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.058814 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.073509 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib
-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.100097 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.100911 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.100942 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.100951 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.100966 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.100975 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:56Z","lastTransitionTime":"2026-03-18T18:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.119350 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.136356 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.155846 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.198073 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.198184 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:56 crc kubenswrapper[5008]: E0318 18:03:56.198338 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.198459 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:56 crc kubenswrapper[5008]: E0318 18:03:56.198544 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:03:56 crc kubenswrapper[5008]: E0318 18:03:56.198825 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.199016 5008 scope.go:117] "RemoveContainer" containerID="1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960" Mar 18 18:03:56 crc kubenswrapper[5008]: E0318 18:03:56.199155 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.200239 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.207777 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.207933 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.207995 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.208086 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.208148 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:56Z","lastTransitionTime":"2026-03-18T18:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.229290 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.267722 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.311975 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.316500 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.316539 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.316547 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.316581 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.316590 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:56Z","lastTransitionTime":"2026-03-18T18:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.348634 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.384400 5008 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.412470 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3
baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.420568 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.420606 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.420616 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:56 crc 
kubenswrapper[5008]: I0318 18:03:56.420635 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.420645 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:56Z","lastTransitionTime":"2026-03-18T18:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.462194 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.490018 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.522439 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.522466 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.522474 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.522487 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.522496 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:56Z","lastTransitionTime":"2026-03-18T18:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.531322 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.625636 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.625686 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.625699 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 
18:03:56.625720 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.625735 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:56Z","lastTransitionTime":"2026-03-18T18:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.727917 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.727979 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.727989 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.728022 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.728034 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:56Z","lastTransitionTime":"2026-03-18T18:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.822585 5008 generic.go:334] "Generic (PLEG): container finished" podID="322f1eea-395d-476c-a43b-c68071d0af20" containerID="ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5" exitCode=0 Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.822618 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" event={"ID":"322f1eea-395d-476c-a43b-c68071d0af20","Type":"ContainerDied","Data":"ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.828154 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerStarted","Data":"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.828207 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerStarted","Data":"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.828228 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerStarted","Data":"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.828247 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerStarted","Data":"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.828270 5008 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerStarted","Data":"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.828289 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerStarted","Data":"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.829723 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.829760 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.829771 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.829785 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.829795 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:56Z","lastTransitionTime":"2026-03-18T18:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.833718 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.859221 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.882227 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.902132 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.918722 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.935416 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.935485 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.935505 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.935531 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.935548 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:56Z","lastTransitionTime":"2026-03-18T18:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.935792 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.951808 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.973741 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:56 crc kubenswrapper[5008]: I0318 18:03:56.997459 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.011847 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.028504 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.038088 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.038126 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.038139 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.038158 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.038170 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:57Z","lastTransitionTime":"2026-03-18T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.046812 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.059770 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.089685 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.142701 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 
18:03:57.142755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.142768 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.142790 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.142808 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:57Z","lastTransitionTime":"2026-03-18T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.247101 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.247162 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.247185 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.247214 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.247237 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:57Z","lastTransitionTime":"2026-03-18T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.349817 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.349908 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.349934 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.349967 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.349996 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:57Z","lastTransitionTime":"2026-03-18T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.452819 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.452872 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.452892 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.452913 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.452933 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:57Z","lastTransitionTime":"2026-03-18T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.556712 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.556776 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.556787 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.556807 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.556822 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:57Z","lastTransitionTime":"2026-03-18T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.660023 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.660090 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.660109 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.660135 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.660155 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:57Z","lastTransitionTime":"2026-03-18T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.762858 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.762899 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.762911 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.762928 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.762938 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:57Z","lastTransitionTime":"2026-03-18T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.844091 5008 generic.go:334] "Generic (PLEG): container finished" podID="322f1eea-395d-476c-a43b-c68071d0af20" containerID="d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f" exitCode=0 Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.844167 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" event={"ID":"322f1eea-395d-476c-a43b-c68071d0af20","Type":"ContainerDied","Data":"d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f"} Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.865516 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.871504 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.871603 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.871614 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 
18:03:57.871632 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.871644 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:57Z","lastTransitionTime":"2026-03-18T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.881145 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.899116 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 
18:03:57.921533 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.938613 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.951863 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.964366 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.974230 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.974289 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.974320 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.974337 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.974381 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:57Z","lastTransitionTime":"2026-03-18T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.975818 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:57 crc kubenswrapper[5008]: I0318 18:03:57.990525 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.001686 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.020224 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:58Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.036408 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:58Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.055087 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:58Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.069007 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T18:03:58Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.077364 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.077581 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.077674 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.077747 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.077830 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.180417 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.180690 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.180750 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.180815 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.180880 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.197709 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:03:58 crc kubenswrapper[5008]: E0318 18:03:58.197911 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.198279 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:03:58 crc kubenswrapper[5008]: E0318 18:03:58.198340 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.199787 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:03:58 crc kubenswrapper[5008]: E0318 18:03:58.199968 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.283877 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.283936 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.283947 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.283969 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.283986 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.386884 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.386943 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.386962 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.386988 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.387013 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.490853 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.490938 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.490962 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.490997 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.491017 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.594134 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.594668 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.594801 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.594895 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.594952 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.697691 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.697946 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.698011 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.698089 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.698171 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.809528 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.809589 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.809603 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.809623 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.809637 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.858059 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerStarted","Data":"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff"} Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.863241 5008 generic.go:334] "Generic (PLEG): container finished" podID="322f1eea-395d-476c-a43b-c68071d0af20" containerID="3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207" exitCode=0 Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.863286 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" event={"ID":"322f1eea-395d-476c-a43b-c68071d0af20","Type":"ContainerDied","Data":"3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207"} Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.882609 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:58Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.899118 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T18:03:58Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.914109 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.914147 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.914156 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.914174 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.914182 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:58Z","lastTransitionTime":"2026-03-18T18:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.914506 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:58Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.924597 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:58Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.941375 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:58Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.966976 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:58Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.979959 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:58Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:58 crc kubenswrapper[5008]: I0318 18:03:58.993147 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:58Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.004754 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:59Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.014387 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:03:59Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.015987 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.016009 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.016018 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.016032 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.016041 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.023746 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:59Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.033859 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:59Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.052248 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:59Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.064637 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:59Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.119503 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.119698 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.119764 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.119839 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.119903 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.222251 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.222302 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.222317 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.222336 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.222349 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.324514 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.324781 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.324790 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.324807 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.324817 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.428479 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.428610 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.428639 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.428687 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.428716 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.533393 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.533457 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.533476 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.533501 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.533519 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.637185 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.637226 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.637236 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.637262 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.637277 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.740680 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.741076 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.741107 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.741134 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.741151 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.844738 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.845147 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.845336 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.845382 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.845402 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.871343 5008 generic.go:334] "Generic (PLEG): container finished" podID="322f1eea-395d-476c-a43b-c68071d0af20" containerID="b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305" exitCode=0 Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.871386 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" event={"ID":"322f1eea-395d-476c-a43b-c68071d0af20","Type":"ContainerDied","Data":"b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305"} Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.895147 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:59Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.912946 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3
baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:59Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.933119 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:59Z is after 2025-08-24T17:21:41Z" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.963759 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.963848 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.963862 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.963909 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.963930 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:03:59Z","lastTransitionTime":"2026-03-18T18:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:03:59 crc kubenswrapper[5008]: I0318 18:03:59.964757 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:03:59Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.021428 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.043423 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.066486 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.066570 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.066588 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.066612 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.066628 5008 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.068108 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799
f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.088207 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.104332 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.125446 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.149428 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.167746 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.169145 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.169166 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.169176 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 
18:04:00.169190 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.169201 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.186214 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.197670 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.197716 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.197756 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:00 crc kubenswrapper[5008]: E0318 18:04:00.197782 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:00 crc kubenswrapper[5008]: E0318 18:04:00.197895 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:00 crc kubenswrapper[5008]: E0318 18:04:00.198047 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.206788 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.271897 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.271937 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.271946 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.271962 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.271970 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.375315 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.375375 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.375393 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.375418 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.375436 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.480617 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-b8t8h"] Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.481173 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-b8t8h" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.486047 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.486083 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.486094 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.486109 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.486118 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.486323 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.486813 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.486844 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.486860 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.509353 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.521511 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.535472 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.551472 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.572259 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.588263 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.589277 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.589325 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.589382 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.589407 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.589425 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.604288 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.622003 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.636798 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5dae087e-43c5-442e-98db-b815e8993c8d-host\") pod \"node-ca-b8t8h\" (UID: \"5dae087e-43c5-442e-98db-b815e8993c8d\") " pod="openshift-image-registry/node-ca-b8t8h" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.636930 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5dae087e-43c5-442e-98db-b815e8993c8d-serviceca\") pod \"node-ca-b8t8h\" (UID: \"5dae087e-43c5-442e-98db-b815e8993c8d\") " pod="openshift-image-registry/node-ca-b8t8h" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.636976 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lc54p\" (UniqueName: \"kubernetes.io/projected/5dae087e-43c5-442e-98db-b815e8993c8d-kube-api-access-lc54p\") pod \"node-ca-b8t8h\" (UID: \"5dae087e-43c5-442e-98db-b815e8993c8d\") " pod="openshift-image-registry/node-ca-b8t8h" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.644075 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.656593 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.666335 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.691843 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.691899 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.691914 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.691940 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.691959 5008 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.694684 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799
f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.713830 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.733973 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.737971 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/5dae087e-43c5-442e-98db-b815e8993c8d-serviceca\") pod \"node-ca-b8t8h\" (UID: \"5dae087e-43c5-442e-98db-b815e8993c8d\") " pod="openshift-image-registry/node-ca-b8t8h" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.738257 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc54p\" (UniqueName: \"kubernetes.io/projected/5dae087e-43c5-442e-98db-b815e8993c8d-kube-api-access-lc54p\") pod \"node-ca-b8t8h\" (UID: \"5dae087e-43c5-442e-98db-b815e8993c8d\") " pod="openshift-image-registry/node-ca-b8t8h" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.738388 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5dae087e-43c5-442e-98db-b815e8993c8d-host\") pod \"node-ca-b8t8h\" (UID: \"5dae087e-43c5-442e-98db-b815e8993c8d\") " pod="openshift-image-registry/node-ca-b8t8h" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.738481 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5dae087e-43c5-442e-98db-b815e8993c8d-host\") pod \"node-ca-b8t8h\" (UID: \"5dae087e-43c5-442e-98db-b815e8993c8d\") " pod="openshift-image-registry/node-ca-b8t8h" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.740315 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5dae087e-43c5-442e-98db-b815e8993c8d-serviceca\") pod \"node-ca-b8t8h\" (UID: \"5dae087e-43c5-442e-98db-b815e8993c8d\") " pod="openshift-image-registry/node-ca-b8t8h" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.746132 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.758225 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc54p\" (UniqueName: \"kubernetes.io/projected/5dae087e-43c5-442e-98db-b815e8993c8d-kube-api-access-lc54p\") pod \"node-ca-b8t8h\" (UID: \"5dae087e-43c5-442e-98db-b815e8993c8d\") " pod="openshift-image-registry/node-ca-b8t8h" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.795971 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.796051 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.796078 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.796112 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.796136 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.811282 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-b8t8h" Mar 18 18:04:00 crc kubenswrapper[5008]: W0318 18:04:00.841634 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dae087e_43c5_442e_98db_b815e8993c8d.slice/crio-a6354a4d0b2202773e1b53445bb820c32c55c6bfc4d98e25b00e815b8c986d6e WatchSource:0}: Error finding container a6354a4d0b2202773e1b53445bb820c32c55c6bfc4d98e25b00e815b8c986d6e: Status 404 returned error can't find the container with id a6354a4d0b2202773e1b53445bb820c32c55c6bfc4d98e25b00e815b8c986d6e Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.894507 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b8t8h" event={"ID":"5dae087e-43c5-442e-98db-b815e8993c8d","Type":"ContainerStarted","Data":"a6354a4d0b2202773e1b53445bb820c32c55c6bfc4d98e25b00e815b8c986d6e"} Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.898667 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.898706 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.898724 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.898747 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.898766 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:00Z","lastTransitionTime":"2026-03-18T18:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.902639 5008 generic.go:334] "Generic (PLEG): container finished" podID="322f1eea-395d-476c-a43b-c68071d0af20" containerID="239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5" exitCode=0 Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.902704 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" event={"ID":"322f1eea-395d-476c-a43b-c68071d0af20","Type":"ContainerDied","Data":"239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5"} Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.924123 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.953141 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.972062 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:00 crc kubenswrapper[5008]: I0318 18:04:00.986404 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:00Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.001456 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.001542 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.001618 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.001645 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.001699 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.002592 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.028548 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.040512 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.056607 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.071504 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.083484 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.097486 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.103735 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.103769 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.103781 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.103799 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.103811 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.108913 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.124368 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.145840 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.157367 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.206209 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.206259 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.206271 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.206292 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.206308 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.309310 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.309373 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.309384 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.309405 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.309420 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.411902 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.411937 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.411972 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.411990 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.412002 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.518873 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.518921 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.518938 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.518960 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.518978 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.623216 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.623300 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.623319 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.623350 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.623374 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.726573 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.726608 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.726618 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.726632 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.726644 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.831425 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.831917 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.831934 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.831956 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.831973 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.909265 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b8t8h" event={"ID":"5dae087e-43c5-442e-98db-b815e8993c8d","Type":"ContainerStarted","Data":"25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b"} Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.919414 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" event={"ID":"322f1eea-395d-476c-a43b-c68071d0af20","Type":"ContainerStarted","Data":"e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15"} Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.932314 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.933462 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerStarted","Data":"3ef3633eb8062b2fbb63bef4608c17f733a20b0fafbc04531ed90107d4864d74"} Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.934655 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.934931 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.935000 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.936008 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.936096 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.936126 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.936165 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.936190 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:01Z","lastTransitionTime":"2026-03-18T18:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:01 crc kubenswrapper[5008]: I0318 18:04:01.962816 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\
",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54
883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:01.986122 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:01Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.005279 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.019127 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.020852 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.026399 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0
dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192
.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.055413 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.055846 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.055890 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.055919 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.055941 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.059083 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.080415 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.107888 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.139189 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.159804 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.159896 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.159921 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.160432 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.160757 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.166270 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.183695 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.197426 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.197503 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:02 crc kubenswrapper[5008]: E0318 18:04:02.197678 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.197786 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:02 crc kubenswrapper[5008]: E0318 18:04:02.197962 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:02 crc kubenswrapper[5008]: E0318 18:04:02.198098 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.210392 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.229453 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.246064 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.263090 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.265885 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 
18:04:02.265979 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.265999 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.266031 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.266051 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.279835 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.297094 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.313100 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.328655 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.342214 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.355335 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.369025 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.369073 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.369089 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.369117 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.369144 5008 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.377648 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799
f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.398804 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.416526 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.431966 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.446729 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.461383 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.472696 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.472732 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.472744 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.472763 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.472774 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.481979 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.499659 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef3633eb8062b2fbb63bef4608c17f733a20b0fafbc04531ed90107d4864d74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.508727 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:02Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.574930 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.574971 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.574979 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.574996 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 
18:04:02.575006 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.678911 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.679020 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.679045 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.679082 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.679106 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.783169 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.783240 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.783259 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.783287 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.783308 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.886659 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.886735 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.886754 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.886781 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.886802 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.990242 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.990309 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.990327 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.990355 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:02 crc kubenswrapper[5008]: I0318 18:04:02.990374 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:02Z","lastTransitionTime":"2026-03-18T18:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.094409 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.094473 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.094493 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.094521 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.094540 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.199204 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.199261 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.199280 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.199303 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.199321 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.367730 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.367781 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.367799 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.367821 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.367837 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.471915 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.472489 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.472722 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.472877 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.473008 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.577155 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.577214 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.577348 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.577387 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.577411 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.680190 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.680232 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.680246 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.680264 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.680278 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.782955 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.782996 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.783014 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.783036 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.783050 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.886716 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.886757 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.886770 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.886788 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.886801 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.989830 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.989885 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.989903 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.989929 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:03 crc kubenswrapper[5008]: I0318 18:04:03.989947 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:03Z","lastTransitionTime":"2026-03-18T18:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.092915 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.093269 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.093379 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.093519 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.093671 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.197365 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.197395 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:04 crc kubenswrapper[5008]: E0318 18:04:04.197488 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:04 crc kubenswrapper[5008]: E0318 18:04:04.197662 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.197803 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:04 crc kubenswrapper[5008]: E0318 18:04:04.197894 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.198223 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.198370 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.198523 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.198741 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.198917 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.222341 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.244915 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.266414 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.282158 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.298819 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.303257 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 
18:04:04.303352 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.303368 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.303390 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.303409 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.323615 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.356073 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.376178 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.400570 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.406414 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.406464 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.406472 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.406485 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.406495 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.420856 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z 
is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.432647 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.441939 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.463570 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.492151 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef3633eb8062b2fbb63bef4608c17f733a20b0fafbc04531ed90107d4864d74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.506506 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4
c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.510215 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.510267 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.510284 5008 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.510308 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.510326 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.614549 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.614602 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.614610 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.614624 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.614634 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.716884 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.716998 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.717020 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.717051 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.717075 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.820486 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.820547 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.820592 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.820616 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.820634 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.875420 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.875486 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.875502 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.875527 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.875539 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[5008]: E0318 18:04:04.894611 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.899891 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.899968 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.899987 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.900033 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.900054 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[5008]: E0318 18:04:04.922109 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.928305 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.928374 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.928394 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.928423 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.928442 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[5008]: E0318 18:04:04.948973 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.949467 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/0.log" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.954669 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.954737 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.954763 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.954797 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.954825 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.955265 5008 generic.go:334] "Generic (PLEG): container finished" podID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerID="3ef3633eb8062b2fbb63bef4608c17f733a20b0fafbc04531ed90107d4864d74" exitCode=1 Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.955339 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"3ef3633eb8062b2fbb63bef4608c17f733a20b0fafbc04531ed90107d4864d74"} Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.956890 5008 scope.go:117] "RemoveContainer" containerID="3ef3633eb8062b2fbb63bef4608c17f733a20b0fafbc04531ed90107d4864d74" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.981396 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: E0318 18:04:04.981635 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:04Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.989417 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.990052 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.990079 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.990121 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:04 crc kubenswrapper[5008]: I0318 18:04:04.990147 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:04Z","lastTransitionTime":"2026-03-18T18:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.008371 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: E0318 18:04:05.011301 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: E0318 18:04:05.011601 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.014358 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.014412 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.014428 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.014452 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.014471 5008 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.036148 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687
fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.066503 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef3633eb8062b2fbb63bef4608c17f733a20b0fafbc04531ed90107d4864d74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef3633eb8062b2fbb63bef4608c17f733a20b0fafbc04531ed90107d4864d74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18
:04:04Z\\\",\\\"message\\\":\\\"val\\\\nI0318 18:04:04.397208 6802 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 18:04:04.397223 6802 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 18:04:04.397248 6802 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 18:04:04.397276 6802 factory.go:656] Stopping watch factory\\\\nI0318 18:04:04.397293 6802 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 18:04:04.397293 6802 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 18:04:04.397302 6802 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 18:04:04.397307 6802 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 18:04:04.397299 6802 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0318 18:04:04.397333 6802 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 18:04:04.397337 6802 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":
\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\
"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.082118 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.099021 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.116965 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.117033 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.117058 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.117087 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.117106 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.121009 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.144259 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.161269 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.185172 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.199737 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.220267 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.220296 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.220304 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.220318 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.220328 5008 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.237950 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799
f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.255297 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.270649 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.285647 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.324293 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.324351 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.324371 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.324396 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.324414 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.426494 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.426542 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.426582 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.426607 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.426620 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.530037 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.530067 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.530077 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.530090 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.530102 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.632006 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.632041 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.632052 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.632067 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.632080 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.734798 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.734827 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.734835 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.734847 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.734856 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.837986 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.838073 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.838097 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.838128 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.838147 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.940395 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.940419 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.940428 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.940441 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.940450 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:05Z","lastTransitionTime":"2026-03-18T18:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.961104 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/0.log" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.964092 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerStarted","Data":"af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29"} Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.964590 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:04:05 crc kubenswrapper[5008]: I0318 18:04:05.985757 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:05Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.014199 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.039859 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.043222 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.043290 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.043305 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.043328 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.043344 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.078535 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.102291 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.114953 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.139004 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.148631 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.148663 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.148672 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.148687 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.148698 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.155924 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.178197 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.194025 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.198322 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.198376 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:06 crc kubenswrapper[5008]: E0318 18:04:06.198493 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.198582 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:06 crc kubenswrapper[5008]: E0318 18:04:06.198752 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:06 crc kubenswrapper[5008]: E0318 18:04:06.198894 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.211416 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.226446 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.248374 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.251292 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.251354 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.251373 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.251397 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.251415 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.283411 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef3633eb8062b2fbb63bef4608c17f733a20b0fafbc04531ed90107d4864d74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"val\\\\nI0318 18:04:04.397208 6802 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 18:04:04.397223 6802 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 18:04:04.397248 6802 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 18:04:04.397276 6802 factory.go:656] 
Stopping watch factory\\\\nI0318 18:04:04.397293 6802 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 18:04:04.397293 6802 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 18:04:04.397302 6802 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 18:04:04.397307 6802 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 18:04:04.397299 6802 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0318 18:04:04.397333 6802 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 18:04:04.397337 6802 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath
\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.296612 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/service
ca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.354708 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.354767 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.354779 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.354795 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.354807 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.458187 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.458228 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.458239 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.458252 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.458261 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.561534 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.561688 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.561707 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.561734 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.561753 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.650527 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq"] Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.651024 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.653534 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.653792 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.664797 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.664830 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.664841 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.664856 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.664893 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.679789 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.698432 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff5b1f8d-21ca-4a18-952a-bbc202aeb521-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vjsrq\" (UID: \"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.698495 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbxck\" (UniqueName: \"kubernetes.io/projected/ff5b1f8d-21ca-4a18-952a-bbc202aeb521-kube-api-access-rbxck\") pod \"ovnkube-control-plane-749d76644c-vjsrq\" (UID: \"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.698798 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff5b1f8d-21ca-4a18-952a-bbc202aeb521-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vjsrq\" (UID: \"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.698898 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ff5b1f8d-21ca-4a18-952a-bbc202aeb521-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vjsrq\" (UID: \"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.703423 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.723691 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.741838 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.768050 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.768092 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.768104 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.768125 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.768140 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.769003 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.799602 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff5b1f8d-21ca-4a18-952a-bbc202aeb521-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vjsrq\" (UID: \"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.799721 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ff5b1f8d-21ca-4a18-952a-bbc202aeb521-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vjsrq\" (UID: \"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.799796 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff5b1f8d-21ca-4a18-952a-bbc202aeb521-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vjsrq\" (UID: \"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.799847 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbxck\" 
(UniqueName: \"kubernetes.io/projected/ff5b1f8d-21ca-4a18-952a-bbc202aeb521-kube-api-access-rbxck\") pod \"ovnkube-control-plane-749d76644c-vjsrq\" (UID: \"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.801014 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ff5b1f8d-21ca-4a18-952a-bbc202aeb521-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vjsrq\" (UID: \"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.801177 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff5b1f8d-21ca-4a18-952a-bbc202aeb521-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vjsrq\" (UID: \"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.805134 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef3633eb8062b2fbb63bef4608c17f733a20b0fafbc04531ed90107d4864d74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"val\\\\nI0318 18:04:04.397208 6802 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 18:04:04.397223 6802 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 18:04:04.397248 6802 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 18:04:04.397276 6802 factory.go:656] 
Stopping watch factory\\\\nI0318 18:04:04.397293 6802 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 18:04:04.397293 6802 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 18:04:04.397302 6802 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 18:04:04.397307 6802 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 18:04:04.397299 6802 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0318 18:04:04.397333 6802 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 18:04:04.397337 6802 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath
\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.810126 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff5b1f8d-21ca-4a18-952a-bbc202aeb521-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vjsrq\" (UID: \"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.824280 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1
bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.833759 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbxck\" (UniqueName: \"kubernetes.io/projected/ff5b1f8d-21ca-4a18-952a-bbc202aeb521-kube-api-access-rbxck\") pod \"ovnkube-control-plane-749d76644c-vjsrq\" (UID: \"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.843168 5008 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.860329 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.873054 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.873126 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.873148 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.873177 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.873197 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.882303 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.898292 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.914776 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.933238 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.950282 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.971023 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/1.log" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 
18:04:06.972009 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/0.log" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.975644 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.975739 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.975766 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.975844 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.975872 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:06Z","lastTransitionTime":"2026-03-18T18:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.977717 5008 generic.go:334] "Generic (PLEG): container finished" podID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerID="af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29" exitCode=1 Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.977840 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29"} Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.977942 5008 scope.go:117] "RemoveContainer" containerID="3ef3633eb8062b2fbb63bef4608c17f733a20b0fafbc04531ed90107d4864d74" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.978128 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.979617 5008 scope.go:117] "RemoveContainer" containerID="af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29" Mar 18 18:04:06 crc kubenswrapper[5008]: E0318 18:04:06.980014 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" Mar 18 18:04:06 crc kubenswrapper[5008]: I0318 18:04:06.992858 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:06Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: W0318 18:04:07.001684 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff5b1f8d_21ca_4a18_952a_bbc202aeb521.slice/crio-6378a1277b421723fb97659c5b3a4862916737c879f4666855d82bc8128228f6 WatchSource:0}: Error finding container 6378a1277b421723fb97659c5b3a4862916737c879f4666855d82bc8128228f6: Status 404 returned error can't find the container with id 6378a1277b421723fb97659c5b3a4862916737c879f4666855d82bc8128228f6 Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.014164 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.029443 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.044005 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.055489 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.073206 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.078606 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.078660 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.078673 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.078717 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.078730 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.091043 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef3633eb8062b2fbb63bef4608c17f733a20b0fafbc04531ed90107d4864d74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"val\\\\nI0318 18:04:04.397208 6802 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 18:04:04.397223 6802 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 18:04:04.397248 6802 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 18:04:04.397276 6802 factory.go:656] 
Stopping watch factory\\\\nI0318 18:04:04.397293 6802 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 18:04:04.397293 6802 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 18:04:04.397302 6802 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 18:04:04.397307 6802 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 18:04:04.397299 6802 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0318 18:04:04.397333 6802 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 18:04:04.397337 6802 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071522 6973 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071416 6973 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 18:04:06.071646 6973 ovnkube.go:599] Stopped ovnkube\\\\nI0318 18:04:06.071669 6973 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 18:04:06.071751 6973 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\
":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-over
rides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 
18:04:07.102099 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.112578 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.122690 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.133470 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.143759 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.154664 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.165435 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.181192 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.181220 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.181232 5008 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.181246 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.181257 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.191861 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a9
3\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d548
83f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.197826 5008 scope.go:117] "RemoveContainer" containerID="1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960" Mar 18 18:04:07 crc kubenswrapper[5008]: E0318 18:04:07.198114 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.205789 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.221210 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.233606 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.284794 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.285574 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.285590 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.285608 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.285622 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.387383 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-g2z9p"] Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.387955 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:07 crc kubenswrapper[5008]: E0318 18:04:07.388023 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.389747 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.389787 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.389799 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.389812 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.389823 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.404966 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs\") pod \"network-metrics-daemon-g2z9p\" (UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.405031 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7tw9\" (UniqueName: \"kubernetes.io/projected/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-kube-api-access-n7tw9\") pod \"network-metrics-daemon-g2z9p\" (UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.405991 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.421541 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.446572 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.461138 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.476105 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.492320 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.492385 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.492398 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.492417 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.492431 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.492932 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.505480 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs\") pod \"network-metrics-daemon-g2z9p\" 
(UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.505536 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7tw9\" (UniqueName: \"kubernetes.io/projected/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-kube-api-access-n7tw9\") pod \"network-metrics-daemon-g2z9p\" (UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:07 crc kubenswrapper[5008]: E0318 18:04:07.505653 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:07 crc kubenswrapper[5008]: E0318 18:04:07.505727 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs podName:1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:08.00570722 +0000 UTC m=+104.525180309 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs") pod "network-metrics-daemon-g2z9p" (UID: "1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.510337 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.531566 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7tw9\" (UniqueName: \"kubernetes.io/projected/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-kube-api-access-n7tw9\") 
pod \"network-metrics-daemon-g2z9p\" (UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.539433 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847
b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e
9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.555915 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.571685 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.586144 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.594667 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.594719 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.594733 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.594752 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.594764 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.602177 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z 
is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.615593 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.625588 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.641238 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.658823 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ef3633eb8062b2fbb63bef4608c17f733a20b0fafbc04531ed90107d4864d74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:04Z\\\",\\\"message\\\":\\\"val\\\\nI0318 18:04:04.397208 6802 handler.go:190] Sending *v1.Node event handler 7 for 
removal\\\\nI0318 18:04:04.397223 6802 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0318 18:04:04.397248 6802 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 18:04:04.397276 6802 factory.go:656] Stopping watch factory\\\\nI0318 18:04:04.397293 6802 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 18:04:04.397293 6802 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 18:04:04.397302 6802 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0318 18:04:04.397307 6802 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 18:04:04.397299 6802 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0318 18:04:04.397333 6802 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 18:04:04.397337 6802 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071522 6973 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071416 6973 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 18:04:06.071646 6973 ovnkube.go:599] Stopped ovnkube\\\\nI0318 18:04:06.071669 6973 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 18:04:06.071751 6973 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd84
9f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.667716 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:07Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.697629 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.697694 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.697706 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.697723 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.697733 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.799677 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.799722 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.799734 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.799751 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.799762 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.902962 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.903028 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.903065 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.903103 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.903126 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:07Z","lastTransitionTime":"2026-03-18T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.984959 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/1.log" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.988650 5008 scope.go:117] "RemoveContainer" containerID="af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29" Mar 18 18:04:07 crc kubenswrapper[5008]: E0318 18:04:07.988795 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.992333 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" event={"ID":"ff5b1f8d-21ca-4a18-952a-bbc202aeb521","Type":"ContainerStarted","Data":"8c84704f28f3b426cf055f8f5f74d1eb4fbf33763d4a75ed403378ea84191f81"} Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.992414 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" event={"ID":"ff5b1f8d-21ca-4a18-952a-bbc202aeb521","Type":"ContainerStarted","Data":"cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077"} Mar 18 18:04:07 crc kubenswrapper[5008]: I0318 18:04:07.992431 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" event={"ID":"ff5b1f8d-21ca-4a18-952a-bbc202aeb521","Type":"ContainerStarted","Data":"6378a1277b421723fb97659c5b3a4862916737c879f4666855d82bc8128228f6"} Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.005373 5008 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.005615 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.005657 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.005668 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.005687 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.005700 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.009982 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.010113 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.010163 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:04:40.010126642 +0000 UTC m=+136.529599761 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.010228 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.010248 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.010285 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:40.010266356 +0000 UTC m=+136.529739525 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.010387 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.010411 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.010412 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.010424 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.010509 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs\") pod \"network-metrics-daemon-g2z9p\" (UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " 
pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.010604 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.010623 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:40.010584594 +0000 UTC m=+136.530057673 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.010695 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.011135 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:40.011121058 +0000 UTC m=+136.530594257 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.011211 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.011287 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs podName:1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:09.011264412 +0000 UTC m=+105.530737511 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs") pod "network-metrics-daemon-g2z9p" (UID: "1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.011365 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.011381 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.011398 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.011433 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:40.011422816 +0000 UTC m=+136.530895905 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.025030 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.044215 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.061136 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.076676 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.095072 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.110200 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.110276 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.110293 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.110319 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.110343 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.130623 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.155608 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.176485 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.195125 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.197349 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.197395 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.197477 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.197593 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.197758 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:08 crc kubenswrapper[5008]: E0318 18:04:08.197915 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.214410 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.214484 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.214510 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.214542 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.214599 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.218797 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.240247 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18
:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.255917 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.276605 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5
089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.299224 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071522 6973 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071416 6973 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 18:04:06.071646 6973 ovnkube.go:599] Stopped ovnkube\\\\nI0318 18:04:06.071669 6973 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 18:04:06.071751 6973 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12
d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.313721 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.322900 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.322950 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.322963 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.323022 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.323035 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.331243 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.349941 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5
089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.380682 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071522 6973 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071416 6973 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 18:04:06.071646 6973 ovnkube.go:599] Stopped ovnkube\\\\nI0318 18:04:06.071669 6973 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 18:04:06.071751 6973 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12
d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.395844 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.416346 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.426231 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.426328 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.426346 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.426371 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.426393 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.429790 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.444472 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.460853 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.475017 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.490807 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf3
3763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.509097 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.527715 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.529014 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.529059 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.529068 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.529084 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.529093 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.546227 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.557657 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.568703 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.599403 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.616655 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.631682 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.631809 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.631865 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.631927 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.631949 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.632703 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:08Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.735081 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.735189 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.735240 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.735272 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.735288 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.838837 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.838898 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.838919 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.838944 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.838963 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.941961 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.942040 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.942060 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.942085 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:08 crc kubenswrapper[5008]: I0318 18:04:08.942106 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:08Z","lastTransitionTime":"2026-03-18T18:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.020817 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs\") pod \"network-metrics-daemon-g2z9p\" (UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:09 crc kubenswrapper[5008]: E0318 18:04:09.021170 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:09 crc kubenswrapper[5008]: E0318 18:04:09.021289 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs podName:1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:11.021255206 +0000 UTC m=+107.540728335 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs") pod "network-metrics-daemon-g2z9p" (UID: "1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.087699 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.087777 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.087804 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.087837 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.087858 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.191778 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.191844 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.191862 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.191887 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.191904 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.197945 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:09 crc kubenswrapper[5008]: E0318 18:04:09.198116 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.294742 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.294820 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.294846 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.294873 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.294893 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.398390 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.398508 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.398531 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.398587 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.398605 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.501867 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.501954 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.501971 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.501995 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.502014 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.604923 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.604981 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.604999 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.605033 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.605053 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.708323 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.708382 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.708398 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.708425 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.708451 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.810985 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.811044 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.811067 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.811094 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.811116 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.914819 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.914876 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.914934 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.914961 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:09 crc kubenswrapper[5008]: I0318 18:04:09.914982 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:09Z","lastTransitionTime":"2026-03-18T18:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.018971 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.019075 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.019114 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.019155 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.019184 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.122957 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.123015 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.123037 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.123065 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.123087 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.198213 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:10 crc kubenswrapper[5008]: E0318 18:04:10.198397 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.198798 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.198849 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:10 crc kubenswrapper[5008]: E0318 18:04:10.199393 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:10 crc kubenswrapper[5008]: E0318 18:04:10.199612 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.225687 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.225747 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.225765 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.225787 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.225806 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.328506 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.328869 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.329005 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.329172 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.329379 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.433009 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.433077 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.433091 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.433116 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.433133 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.536374 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.536439 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.536459 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.536486 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.536503 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.639657 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.639700 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.639715 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.639765 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.639782 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.741769 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.741808 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.741819 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.741836 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.741847 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.844632 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.845045 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.845245 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.845451 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.845719 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.949279 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.949360 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.949385 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.949413 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:10 crc kubenswrapper[5008]: I0318 18:04:10.949432 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:10Z","lastTransitionTime":"2026-03-18T18:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.043745 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs\") pod \"network-metrics-daemon-g2z9p\" (UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:11 crc kubenswrapper[5008]: E0318 18:04:11.043963 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:11 crc kubenswrapper[5008]: E0318 18:04:11.044087 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs podName:1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:15.044057851 +0000 UTC m=+111.563530960 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs") pod "network-metrics-daemon-g2z9p" (UID: "1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.052850 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.052904 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.052921 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.052948 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.052966 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.156641 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.156688 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.156701 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.156720 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.156735 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.197746 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:11 crc kubenswrapper[5008]: E0318 18:04:11.197962 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.259549 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.259653 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.259674 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.259701 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.259733 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.362670 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.362722 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.362743 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.362766 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.362787 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.466224 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.466284 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.466301 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.466328 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.466348 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.570452 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.570500 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.570519 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.570539 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.570576 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.674463 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.674536 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.674606 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.674636 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.674656 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.778240 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.778327 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.778350 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.778376 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.778394 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.881713 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.881766 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.881782 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.881803 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.881817 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.984791 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.984874 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.984900 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.984931 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:11 crc kubenswrapper[5008]: I0318 18:04:11.984952 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:11Z","lastTransitionTime":"2026-03-18T18:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.088538 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.088644 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.088663 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.088689 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.088706 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.191923 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.191970 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.191986 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.192008 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.192023 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.198242 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.198406 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.198277 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:12 crc kubenswrapper[5008]: E0318 18:04:12.198969 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:12 crc kubenswrapper[5008]: E0318 18:04:12.198844 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:12 crc kubenswrapper[5008]: E0318 18:04:12.198652 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.295192 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.295243 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.295260 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.295283 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.295300 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.397961 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.398238 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.398341 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.398499 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.398653 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.502201 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.502264 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.502281 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.502307 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.502324 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.605820 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.605911 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.605931 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.605955 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.605973 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.709005 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.709312 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.709489 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.709656 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.709775 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.812643 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.813044 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.813266 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.813638 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.813785 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.917346 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.917728 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.917948 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.918133 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:12 crc kubenswrapper[5008]: I0318 18:04:12.918460 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:12Z","lastTransitionTime":"2026-03-18T18:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.021295 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.021343 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.021360 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.021378 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.021390 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.124681 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.124779 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.124798 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.124821 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.124837 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.197801 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:13 crc kubenswrapper[5008]: E0318 18:04:13.199067 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.228301 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.228660 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.228911 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.229225 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.229515 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.333117 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.333180 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.333201 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.333233 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.333254 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.436073 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.436320 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.436419 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.436523 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.436655 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.539005 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.539363 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.539455 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.539541 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.539673 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.642690 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.642756 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.642774 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.642805 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.642823 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.746146 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.746399 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.746461 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.746532 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.746688 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.849235 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.849302 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.849316 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.849338 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.849355 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.953107 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.953181 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.953202 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.953234 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:13 crc kubenswrapper[5008]: I0318 18:04:13.953261 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:13Z","lastTransitionTime":"2026-03-18T18:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.056429 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.056493 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.056512 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.056537 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.056595 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.159758 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.159828 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.159848 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.159876 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.159917 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.197641 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.197696 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.197778 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:14 crc kubenswrapper[5008]: E0318 18:04:14.197876 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:14 crc kubenswrapper[5008]: E0318 18:04:14.198096 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:14 crc kubenswrapper[5008]: E0318 18:04:14.198292 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.226685 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.250439 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.263231 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.263532 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.263807 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.264000 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.264200 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.274010 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.292949 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.320427 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.353785 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071522 6973 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071416 6973 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 18:04:06.071646 6973 ovnkube.go:599] Stopped ovnkube\\\\nI0318 18:04:06.071669 6973 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 18:04:06.071751 6973 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12
d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.368015 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.368494 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.368714 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.368878 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.369046 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.369210 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.389216 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.408536 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.432242 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.452644 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.469963 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.472640 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 
18:04:14.472685 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.472707 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.472739 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.472762 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.488364 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf3
3763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.506011 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.545390 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.566396 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874
ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.575894 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.575963 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.575984 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.576011 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.576031 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.584395 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:14Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:14 crc 
kubenswrapper[5008]: I0318 18:04:14.679630 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.679717 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.679746 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.679779 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.679804 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.784300 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.784374 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.784401 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.784433 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.784460 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.901077 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.901139 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.901159 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.901187 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:14 crc kubenswrapper[5008]: I0318 18:04:14.901206 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:14Z","lastTransitionTime":"2026-03-18T18:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.005092 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.005165 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.005188 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.005218 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.005241 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.101976 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs\") pod \"network-metrics-daemon-g2z9p\" (UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:15 crc kubenswrapper[5008]: E0318 18:04:15.102424 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:15 crc kubenswrapper[5008]: E0318 18:04:15.102703 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs podName:1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:23.10265142 +0000 UTC m=+119.622124659 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs") pod "network-metrics-daemon-g2z9p" (UID: "1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.112675 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.112745 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.112767 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.112794 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.112812 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.197660 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:15 crc kubenswrapper[5008]: E0318 18:04:15.197933 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.216968 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.217056 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.217074 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.217102 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.217247 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.274218 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.274294 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.274317 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.274348 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.274368 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: E0318 18:04:15.302456 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:15Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.308757 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.308813 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.308832 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.308854 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.308872 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: E0318 18:04:15.330641 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:15Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.335685 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.335727 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.335743 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.335763 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.335779 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: E0318 18:04:15.355721 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:15Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.362912 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.362991 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.363012 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.363044 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.363066 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: E0318 18:04:15.379506 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:15Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.384971 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.385045 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.385063 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.385090 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.385109 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: E0318 18:04:15.409990 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:15Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:15 crc kubenswrapper[5008]: E0318 18:04:15.410139 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.412388 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.412458 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.412472 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.412495 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.412511 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.516797 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.516873 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.516890 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.516915 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.516935 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.619816 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.619924 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.619950 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.619979 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.620000 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.723029 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.723114 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.723145 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.723176 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.723196 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.826644 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.826709 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.826728 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.826755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.826775 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.930478 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.930545 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.930619 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.930664 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:15 crc kubenswrapper[5008]: I0318 18:04:15.930693 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:15Z","lastTransitionTime":"2026-03-18T18:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.033689 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.033742 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.033757 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.033781 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.033796 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.137439 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.137523 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.137543 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.137614 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.137639 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.198433 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.198463 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.198647 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:16 crc kubenswrapper[5008]: E0318 18:04:16.198818 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:16 crc kubenswrapper[5008]: E0318 18:04:16.198964 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:16 crc kubenswrapper[5008]: E0318 18:04:16.199069 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.240283 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.240331 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.240341 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.240360 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.240371 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.344100 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.344163 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.344173 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.344193 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.344204 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.448929 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.448992 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.449004 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.449026 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.449361 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.551800 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.551872 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.551892 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.551920 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.551941 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.654795 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.654851 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.654862 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.654881 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.654895 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.758536 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.758619 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.758634 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.758653 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.758668 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.862027 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.862093 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.862112 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.862137 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.862154 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.965504 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.965598 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.965618 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.965650 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:16 crc kubenswrapper[5008]: I0318 18:04:16.965673 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:16Z","lastTransitionTime":"2026-03-18T18:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.069778 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.069839 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.069848 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.069861 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.069889 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:17Z","lastTransitionTime":"2026-03-18T18:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.173291 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.173351 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.173363 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.173385 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.173399 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:17Z","lastTransitionTime":"2026-03-18T18:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.198187 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:17 crc kubenswrapper[5008]: E0318 18:04:17.198401 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.277826 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.278004 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.278028 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.278058 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.278113 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:17Z","lastTransitionTime":"2026-03-18T18:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.382453 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.382569 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.382583 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.382602 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.382615 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:17Z","lastTransitionTime":"2026-03-18T18:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.486707 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.486781 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.486798 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.486824 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.486841 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:17Z","lastTransitionTime":"2026-03-18T18:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.590087 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.590147 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.590165 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.590192 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.590214 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:17Z","lastTransitionTime":"2026-03-18T18:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.694134 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.694211 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.694230 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.694259 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.694278 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:17Z","lastTransitionTime":"2026-03-18T18:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.797628 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.797685 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.797697 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.797717 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.797732 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:17Z","lastTransitionTime":"2026-03-18T18:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.900634 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.900703 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.900715 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.900739 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:17 crc kubenswrapper[5008]: I0318 18:04:17.900759 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:17Z","lastTransitionTime":"2026-03-18T18:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.004482 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.004605 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.004626 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.004656 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.004678 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:18Z","lastTransitionTime":"2026-03-18T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.108168 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.108216 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.108231 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.108250 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.108263 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:18Z","lastTransitionTime":"2026-03-18T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.198076 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.198188 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:18 crc kubenswrapper[5008]: E0318 18:04:18.198440 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.198453 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:18 crc kubenswrapper[5008]: E0318 18:04:18.198675 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:18 crc kubenswrapper[5008]: E0318 18:04:18.198760 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.199785 5008 scope.go:117] "RemoveContainer" containerID="1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.210964 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.211039 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.211054 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.211076 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.211096 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:18Z","lastTransitionTime":"2026-03-18T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.315751 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.315829 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.315852 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.315877 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.315898 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:18Z","lastTransitionTime":"2026-03-18T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.418898 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.418941 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.418955 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.418978 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.418997 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:18Z","lastTransitionTime":"2026-03-18T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.521454 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.521506 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.521522 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.521545 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.521593 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:18Z","lastTransitionTime":"2026-03-18T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.625372 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.625439 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.625457 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.625490 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.625509 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:18Z","lastTransitionTime":"2026-03-18T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.728229 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.728278 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.728297 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.728319 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.728339 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:18Z","lastTransitionTime":"2026-03-18T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.831531 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.831623 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.831645 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.831677 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.831700 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:18Z","lastTransitionTime":"2026-03-18T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.934086 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.934146 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.934164 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.934188 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:18 crc kubenswrapper[5008]: I0318 18:04:18.934207 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:18Z","lastTransitionTime":"2026-03-18T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.036424 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.036480 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.036496 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.036518 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.036535 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:19Z","lastTransitionTime":"2026-03-18T18:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.038204 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.040664 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237"} Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.041087 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.062094 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed591
79a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/k
ubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.084211 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.115817 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071522 6973 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071416 6973 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 18:04:06.071646 6973 ovnkube.go:599] Stopped ovnkube\\\\nI0318 18:04:06.071669 6973 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 18:04:06.071751 6973 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12
d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.131921 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.139607 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.139681 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.139707 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.139738 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.139761 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:19Z","lastTransitionTime":"2026-03-18T18:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.151331 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.168142 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.188502 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.198215 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:19 crc kubenswrapper[5008]: E0318 18:04:19.198598 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.205261 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.206102 5008 scope.go:117] "RemoveContainer" containerID="af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.243607 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.243679 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.243701 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.243726 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.243745 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:19Z","lastTransitionTime":"2026-03-18T18:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.247598 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.264359 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf3
3763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.289432 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.308365 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.330107 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.343812 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc 
kubenswrapper[5008]: I0318 18:04:19.346707 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.346748 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.346756 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.346773 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.346783 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:19Z","lastTransitionTime":"2026-03-18T18:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.357146 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.383752 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.407429 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:19Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.449616 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.449689 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.449720 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.450506 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.450828 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:19Z","lastTransitionTime":"2026-03-18T18:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.554013 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.554098 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.554116 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.554470 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.554811 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:19Z","lastTransitionTime":"2026-03-18T18:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.658318 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.658430 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.658449 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.658478 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.658497 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:19Z","lastTransitionTime":"2026-03-18T18:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.761877 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.761932 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.761952 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.761977 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.761995 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:19Z","lastTransitionTime":"2026-03-18T18:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.864760 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.864813 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.864830 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.864854 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.864870 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:19Z","lastTransitionTime":"2026-03-18T18:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.967785 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.967826 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.967837 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.967854 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:19 crc kubenswrapper[5008]: I0318 18:04:19.967866 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:19Z","lastTransitionTime":"2026-03-18T18:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.046856 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/1.log" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.050598 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerStarted","Data":"357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924"} Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.050837 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.061002 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c
5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.069624 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.069664 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.069671 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.069687 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.069696 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:20Z","lastTransitionTime":"2026-03-18T18:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.073117 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.082830 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.098019 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.117930 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071522 6973 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071416 6973 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 18:04:06.071646 6973 ovnkube.go:599] Stopped ovnkube\\\\nI0318 18:04:06.071669 6973 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 18:04:06.071751 6973 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.132887 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3
baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.145518 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf3
3763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.159515 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.170269 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.172274 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.172310 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.172323 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.172343 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.172358 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:20Z","lastTransitionTime":"2026-03-18T18:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.180514 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.190363 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.197411 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.197518 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:20 crc kubenswrapper[5008]: E0318 18:04:20.197667 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.197744 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:20 crc kubenswrapper[5008]: E0318 18:04:20.197889 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:20 crc kubenswrapper[5008]: E0318 18:04:20.198010 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.200784 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.217986 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.232543 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.244933 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc 
kubenswrapper[5008]: I0318 18:04:20.274814 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.274859 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.274871 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.274889 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.274904 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:20Z","lastTransitionTime":"2026-03-18T18:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.288215 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.303097 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18
:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.377817 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.377864 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.377879 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.377902 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.377917 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:20Z","lastTransitionTime":"2026-03-18T18:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.480390 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.480445 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.480457 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.480476 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.480492 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:20Z","lastTransitionTime":"2026-03-18T18:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.584073 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.584165 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.584185 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.584228 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.584257 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:20Z","lastTransitionTime":"2026-03-18T18:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.688289 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.688355 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.688375 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.688402 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.688421 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:20Z","lastTransitionTime":"2026-03-18T18:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.791849 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.791917 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.791937 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.791965 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.791985 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:20Z","lastTransitionTime":"2026-03-18T18:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.895668 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.895719 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.895731 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.895749 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.895760 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:20Z","lastTransitionTime":"2026-03-18T18:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.999065 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.999114 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.999124 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.999142 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:20 crc kubenswrapper[5008]: I0318 18:04:20.999154 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:20Z","lastTransitionTime":"2026-03-18T18:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.056837 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/2.log" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.057795 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/1.log" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.061785 5008 generic.go:334] "Generic (PLEG): container finished" podID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerID="357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924" exitCode=1 Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.061845 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924"} Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.061933 5008 scope.go:117] "RemoveContainer" containerID="af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.063104 5008 scope.go:117] "RemoveContainer" containerID="357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924" Mar 18 18:04:21 crc kubenswrapper[5008]: E0318 18:04:21.063411 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.086032 5008 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.102160 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.102222 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.102243 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.102268 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.102287 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:21Z","lastTransitionTime":"2026-03-18T18:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.108712 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z 
is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.133550 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.167164 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9bc80717a71dbc5cbab0c482529d5c5f7c9fe3038a8ec47cdaea7cfd27ad29\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071522 6973 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 18:04:06.071416 6973 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0318 18:04:06.071646 6973 ovnkube.go:599] Stopped ovnkube\\\\nI0318 18:04:06.071669 6973 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 18:04:06.071751 6973 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:20Z\\\",\\\"message\\\":\\\"failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z]\\\\nI0318 18:04:20.365953 7209 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365958 7209 ovn.go:134] 
Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0318 18:04:20.365962 7209 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0318 18:04:20.365966 7209 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365972 7209 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-5278w\\\\nI0318 18:04:20.365954 7209 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\
\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\
\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 
18:04:21.184174 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.197533 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:21 crc kubenswrapper[5008]: E0318 18:04:21.197739 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.204824 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.205394 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.205662 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.205888 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.206075 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.206272 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:21Z","lastTransitionTime":"2026-03-18T18:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.220070 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.240182 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.257779 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.274522 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.292593 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf3
3763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.309233 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.309299 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.309317 5008 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.309341 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.309361 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:21Z","lastTransitionTime":"2026-03-18T18:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.317286 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.336479 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.359479 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.376760 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc 
kubenswrapper[5008]: I0318 18:04:21.392323 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.412423 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.412493 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.412512 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 
18:04:21.412537 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.412601 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:21Z","lastTransitionTime":"2026-03-18T18:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.417515 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:21Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.514791 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.514830 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.514843 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 
18:04:21.514858 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.514870 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:21Z","lastTransitionTime":"2026-03-18T18:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.618889 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.618949 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.620217 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.620267 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.620287 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:21Z","lastTransitionTime":"2026-03-18T18:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.723042 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.723089 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.723106 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.723128 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.723146 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:21Z","lastTransitionTime":"2026-03-18T18:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.826362 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.826415 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.826431 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.826454 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.826471 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:21Z","lastTransitionTime":"2026-03-18T18:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.929970 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.930037 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.930057 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.930081 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:21 crc kubenswrapper[5008]: I0318 18:04:21.930098 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:21Z","lastTransitionTime":"2026-03-18T18:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.032400 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.032462 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.032479 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.032503 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.032520 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:22Z","lastTransitionTime":"2026-03-18T18:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.071913 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/2.log" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.077539 5008 scope.go:117] "RemoveContainer" containerID="357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924" Mar 18 18:04:22 crc kubenswrapper[5008]: E0318 18:04:22.077899 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.095602 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf3
3763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.114024 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.134396 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.137222 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.137267 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.137288 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.137314 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.137336 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:22Z","lastTransitionTime":"2026-03-18T18:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.153614 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.172073 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.190862 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.202532 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:22 crc kubenswrapper[5008]: E0318 18:04:22.203594 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.203717 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.203717 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:22 crc kubenswrapper[5008]: E0318 18:04:22.203900 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:22 crc kubenswrapper[5008]: E0318 18:04:22.204011 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.209037 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.238042 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.241372 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.241432 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.241450 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.241474 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.241492 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:22Z","lastTransitionTime":"2026-03-18T18:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.262367 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.283004 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc 
kubenswrapper[5008]: I0318 18:04:22.304482 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.324515 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.342833 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.345636 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.345694 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.345713 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.345737 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.345756 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:22Z","lastTransitionTime":"2026-03-18T18:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.358791 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.376850 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4
383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203
bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-
cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.406705 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:20Z\\\",\\\"message\\\":\\\"failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z]\\\\nI0318 18:04:20.365953 7209 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365958 7209 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0318 18:04:20.365962 7209 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0318 18:04:20.365966 7209 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365972 7209 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-5278w\\\\nI0318 18:04:20.365954 7209 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12
d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.419711 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:22Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.448108 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.448173 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.448190 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.448214 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.448232 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:22Z","lastTransitionTime":"2026-03-18T18:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.553545 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.554083 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.554095 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.554115 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.554129 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:22Z","lastTransitionTime":"2026-03-18T18:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.656895 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.656972 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.656993 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.657019 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.657038 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:22Z","lastTransitionTime":"2026-03-18T18:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.761093 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.761153 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.761170 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.761195 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.761215 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:22Z","lastTransitionTime":"2026-03-18T18:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.863838 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.863880 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.863892 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.863910 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.863922 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:22Z","lastTransitionTime":"2026-03-18T18:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.966775 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.966821 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.966832 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.966850 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:22 crc kubenswrapper[5008]: I0318 18:04:22.966864 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:22Z","lastTransitionTime":"2026-03-18T18:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.069855 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.069926 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.069945 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.069972 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.069993 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:23Z","lastTransitionTime":"2026-03-18T18:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.106793 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs\") pod \"network-metrics-daemon-g2z9p\" (UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:23 crc kubenswrapper[5008]: E0318 18:04:23.107097 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:23 crc kubenswrapper[5008]: E0318 18:04:23.107251 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs podName:1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7 nodeName:}" failed. No retries permitted until 2026-03-18 18:04:39.107212174 +0000 UTC m=+135.626685293 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs") pod "network-metrics-daemon-g2z9p" (UID: "1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.173501 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.173595 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.173615 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.173649 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.173670 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:23Z","lastTransitionTime":"2026-03-18T18:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.197943 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:23 crc kubenswrapper[5008]: E0318 18:04:23.198277 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.277123 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.277193 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.277213 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.277241 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.277261 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:23Z","lastTransitionTime":"2026-03-18T18:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.380284 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.380353 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.380371 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.380396 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.380416 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:23Z","lastTransitionTime":"2026-03-18T18:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.483945 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.484009 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.484027 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.484049 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.484109 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:23Z","lastTransitionTime":"2026-03-18T18:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.587505 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.587664 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.587691 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.587722 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.587742 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:23Z","lastTransitionTime":"2026-03-18T18:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.690469 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.690590 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.690617 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.690651 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.690676 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:23Z","lastTransitionTime":"2026-03-18T18:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.793625 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.793694 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.793718 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.793748 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.793770 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:23Z","lastTransitionTime":"2026-03-18T18:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.896365 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.896450 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.896478 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.896509 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:23 crc kubenswrapper[5008]: I0318 18:04:23.896532 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:23Z","lastTransitionTime":"2026-03-18T18:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.000511 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.000633 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.000654 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.000690 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.000713 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:24Z","lastTransitionTime":"2026-03-18T18:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.103999 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.104062 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.104079 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.104110 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.104132 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:24Z","lastTransitionTime":"2026-03-18T18:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.198356 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.198405 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:24 crc kubenswrapper[5008]: E0318 18:04:24.198736 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.198838 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:24 crc kubenswrapper[5008]: E0318 18:04:24.199064 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:24 crc kubenswrapper[5008]: E0318 18:04:24.199927 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:24 crc kubenswrapper[5008]: E0318 18:04:24.205507 5008 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.224323 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.240052 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.256694 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.270897 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf3
3763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.285826 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.304993 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: E0318 18:04:24.313106 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.329489 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 
18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.347104 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc 
kubenswrapper[5008]: I0318 18:04:24.362446 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.384940 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.397945 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.416835 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.435079 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1cecea
bea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:
03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.465824 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:20Z\\\",\\\"message\\\":\\\"failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z]\\\\nI0318 18:04:20.365953 7209 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365958 7209 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0318 18:04:20.365962 7209 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0318 18:04:20.365966 7209 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365972 7209 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-5278w\\\\nI0318 18:04:20.365954 7209 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12
d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.478792 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.497214 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:24 crc kubenswrapper[5008]: I0318 18:04:24.512500 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:24Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.197451 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:25 crc kubenswrapper[5008]: E0318 18:04:25.197724 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.767114 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.767593 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.767621 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.767661 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.767681 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:25Z","lastTransitionTime":"2026-03-18T18:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:25 crc kubenswrapper[5008]: E0318 18:04:25.794342 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:25Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.801613 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.801692 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.801710 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.801735 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.801753 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:25Z","lastTransitionTime":"2026-03-18T18:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:25 crc kubenswrapper[5008]: E0318 18:04:25.822843 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:25Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.828637 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.828739 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.828768 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.828799 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.828824 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:25Z","lastTransitionTime":"2026-03-18T18:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:25 crc kubenswrapper[5008]: E0318 18:04:25.846340 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:25Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.851444 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.851496 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.851511 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.851529 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.851541 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:25Z","lastTransitionTime":"2026-03-18T18:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:25 crc kubenswrapper[5008]: E0318 18:04:25.864453 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:25Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.869271 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.869330 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.869347 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.869371 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:25 crc kubenswrapper[5008]: I0318 18:04:25.869391 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:25Z","lastTransitionTime":"2026-03-18T18:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:25 crc kubenswrapper[5008]: E0318 18:04:25.888009 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:25Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:25 crc kubenswrapper[5008]: E0318 18:04:25.888238 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 18:04:26 crc kubenswrapper[5008]: I0318 18:04:26.198269 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:26 crc kubenswrapper[5008]: I0318 18:04:26.198324 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:26 crc kubenswrapper[5008]: E0318 18:04:26.198433 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:26 crc kubenswrapper[5008]: I0318 18:04:26.198458 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:26 crc kubenswrapper[5008]: E0318 18:04:26.198590 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:26 crc kubenswrapper[5008]: E0318 18:04:26.198677 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:27 crc kubenswrapper[5008]: I0318 18:04:27.197583 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:27 crc kubenswrapper[5008]: E0318 18:04:27.197790 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:28 crc kubenswrapper[5008]: I0318 18:04:28.197944 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:28 crc kubenswrapper[5008]: I0318 18:04:28.197990 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:28 crc kubenswrapper[5008]: E0318 18:04:28.198137 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:28 crc kubenswrapper[5008]: I0318 18:04:28.198195 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:28 crc kubenswrapper[5008]: E0318 18:04:28.198312 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:28 crc kubenswrapper[5008]: E0318 18:04:28.198404 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:29 crc kubenswrapper[5008]: I0318 18:04:29.198155 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:29 crc kubenswrapper[5008]: E0318 18:04:29.198366 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:29 crc kubenswrapper[5008]: E0318 18:04:29.314689 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 18:04:30 crc kubenswrapper[5008]: I0318 18:04:30.198152 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:30 crc kubenswrapper[5008]: I0318 18:04:30.198172 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:30 crc kubenswrapper[5008]: E0318 18:04:30.198465 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:30 crc kubenswrapper[5008]: I0318 18:04:30.198209 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:30 crc kubenswrapper[5008]: E0318 18:04:30.198662 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:30 crc kubenswrapper[5008]: E0318 18:04:30.198945 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:31 crc kubenswrapper[5008]: I0318 18:04:31.198058 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:31 crc kubenswrapper[5008]: E0318 18:04:31.198269 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.197911 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.198034 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:32 crc kubenswrapper[5008]: E0318 18:04:32.198199 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.198237 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:32 crc kubenswrapper[5008]: E0318 18:04:32.198767 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:32 crc kubenswrapper[5008]: E0318 18:04:32.198931 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.200526 5008 scope.go:117] "RemoveContainer" containerID="357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924" Mar 18 18:04:32 crc kubenswrapper[5008]: E0318 18:04:32.200906 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.220014 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.472355 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.492692 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5
089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.525409 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:20Z\\\",\\\"message\\\":\\\"failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z]\\\\nI0318 18:04:20.365953 7209 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365958 7209 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0318 18:04:20.365962 7209 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0318 18:04:20.365966 7209 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365972 7209 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-5278w\\\\nI0318 18:04:20.365954 7209 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12
d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.538677 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.555761 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.570853 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.590420 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.605371 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.620750 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.639543 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf3
3763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.657851 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.675270 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.693480 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.706545 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.718273 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.752075 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.770970 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.786263 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:32 crc kubenswrapper[5008]: I0318 18:04:32.802834 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70cdd8-b09d-497b-b12a-fe8f7c9c28cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55b1d1dd9c1f850855b50655cd769c4380c67ac5fcd7203eefc24d35e53bcb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3764491ec6fd4ded959c9f447badb396933fdb670769eece8e1371ce2df4288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b929d609244b89fe0628e9f9fde457d15fe0745a6fb11039befdd9b87fc7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:33 crc kubenswrapper[5008]: I0318 18:04:33.200522 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:33 crc kubenswrapper[5008]: E0318 18:04:33.200792 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.197679 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:34 crc kubenswrapper[5008]: E0318 18:04:34.197896 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.197984 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.198046 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:34 crc kubenswrapper[5008]: E0318 18:04:34.198256 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:34 crc kubenswrapper[5008]: E0318 18:04:34.198425 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.216079 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf33763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.236284 5008 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.248385 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.261879 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.277078 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.292356 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.303421 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: E0318 18:04:34.315384 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.333710 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.350530 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\"
,\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.362741 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.375516 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70cdd8-b09d-497b-b12a-fe8f7c9c28cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55b1d1dd9c1f850855b50655cd769c4380c67ac5fcd7203eefc24d35e53bcb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3764491ec6fd4ded959c9f447badb396933fdb670769eece8e1371ce2df4288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b929d609244b89fe0628e9f9fde457d15fe0745a6fb11039befdd9b87fc7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e37
56f9fbc8f7549ed4274c2937b248946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.392097 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.408833 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.428687 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.441323 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.507845 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.535484 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:20Z\\\",\\\"message\\\":\\\"failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z]\\\\nI0318 18:04:20.365953 7209 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365958 7209 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0318 18:04:20.365962 7209 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0318 18:04:20.365966 7209 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365972 7209 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-5278w\\\\nI0318 18:04:20.365954 7209 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12
d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:34 crc kubenswrapper[5008]: I0318 18:04:34.551179 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:35 crc kubenswrapper[5008]: I0318 18:04:35.197445 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:35 crc kubenswrapper[5008]: E0318 18:04:35.197680 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.139413 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.139469 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.139480 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.139495 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.139506 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:36Z","lastTransitionTime":"2026-03-18T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:36 crc kubenswrapper[5008]: E0318 18:04:36.150889 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.154978 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.155027 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.155040 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.155057 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.155069 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:36Z","lastTransitionTime":"2026-03-18T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:36 crc kubenswrapper[5008]: E0318 18:04:36.172721 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.177195 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.177244 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.177263 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.177287 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.177305 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:36Z","lastTransitionTime":"2026-03-18T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:36 crc kubenswrapper[5008]: E0318 18:04:36.189394 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.192732 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.192763 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.192772 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.192786 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.192796 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:36Z","lastTransitionTime":"2026-03-18T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.198091 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.198158 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.198183 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:36 crc kubenswrapper[5008]: E0318 18:04:36.198275 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:36 crc kubenswrapper[5008]: E0318 18:04:36.198365 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:36 crc kubenswrapper[5008]: E0318 18:04:36.198465 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:36 crc kubenswrapper[5008]: E0318 18:04:36.206882 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.210649 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.210690 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.210703 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.210719 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:36 crc kubenswrapper[5008]: I0318 18:04:36.210733 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:36Z","lastTransitionTime":"2026-03-18T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:36 crc kubenswrapper[5008]: E0318 18:04:36.223706 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:36 crc kubenswrapper[5008]: E0318 18:04:36.223927 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 18:04:37 crc kubenswrapper[5008]: I0318 18:04:37.198004 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:37 crc kubenswrapper[5008]: E0318 18:04:37.198130 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:38 crc kubenswrapper[5008]: I0318 18:04:38.366047 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:38 crc kubenswrapper[5008]: E0318 18:04:38.366185 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:38 crc kubenswrapper[5008]: I0318 18:04:38.366398 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:38 crc kubenswrapper[5008]: E0318 18:04:38.366466 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:38 crc kubenswrapper[5008]: I0318 18:04:38.366747 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:38 crc kubenswrapper[5008]: E0318 18:04:38.367012 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:39 crc kubenswrapper[5008]: I0318 18:04:39.171051 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs\") pod \"network-metrics-daemon-g2z9p\" (UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:39 crc kubenswrapper[5008]: E0318 18:04:39.171227 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:39 crc kubenswrapper[5008]: E0318 18:04:39.171287 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs podName:1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7 nodeName:}" failed. No retries permitted until 2026-03-18 18:05:11.171270286 +0000 UTC m=+167.690743375 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs") pod "network-metrics-daemon-g2z9p" (UID: "1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:04:39 crc kubenswrapper[5008]: I0318 18:04:39.198059 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:39 crc kubenswrapper[5008]: E0318 18:04:39.198285 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:39 crc kubenswrapper[5008]: E0318 18:04:39.316494 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 18:04:40 crc kubenswrapper[5008]: I0318 18:04:40.078016 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.078223 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:44.078182837 +0000 UTC m=+200.597655956 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:04:40 crc kubenswrapper[5008]: I0318 18:04:40.078350 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:40 crc kubenswrapper[5008]: I0318 18:04:40.078433 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:40 crc kubenswrapper[5008]: I0318 18:04:40.078483 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:40 crc kubenswrapper[5008]: I0318 18:04:40.078551 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.078679 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.078704 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.078720 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.078755 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:05:44.078739192 +0000 UTC m=+200.598212311 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.078757 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.078790 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.078814 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.078873 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.078895 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.078831 5008 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:05:44.078798183 +0000 UTC m=+200.598271302 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.079040 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:05:44.078991898 +0000 UTC m=+200.598465007 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.079085 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:05:44.079070741 +0000 UTC m=+200.598543860 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:04:40 crc kubenswrapper[5008]: I0318 18:04:40.198418 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:40 crc kubenswrapper[5008]: I0318 18:04:40.198414 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:40 crc kubenswrapper[5008]: I0318 18:04:40.198824 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.199044 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.199122 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:40 crc kubenswrapper[5008]: E0318 18:04:40.198847 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:40 crc kubenswrapper[5008]: I0318 18:04:40.213868 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.197703 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:41 crc kubenswrapper[5008]: E0318 18:04:41.197894 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.391200 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgv8s_9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd/kube-multus/0.log" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.391296 5008 generic.go:334] "Generic (PLEG): container finished" podID="9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd" containerID="4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9" exitCode=1 Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.391342 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgv8s" event={"ID":"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd","Type":"ContainerDied","Data":"4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9"} Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.391886 5008 scope.go:117] "RemoveContainer" containerID="4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.430435 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:20Z\\\",\\\"message\\\":\\\"failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z]\\\\nI0318 18:04:20.365953 7209 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365958 7209 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0318 18:04:20.365962 7209 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0318 18:04:20.365966 7209 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365972 7209 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-5278w\\\\nI0318 18:04:20.365954 7209 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12
d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.446080 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.469088 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.487029 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.509416 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.532083 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.554627 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.571543 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf3
3763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.591510 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.610982 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.626277 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.642872 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc 
kubenswrapper[5008]: I0318 18:04:41.660173 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.693348 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.707755 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.719683 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:41Z\\\",\\\"message\\\":\\\"2026-03-18T18:03:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903\\\\n2026-03-18T18:03:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903 to /host/opt/cni/bin/\\\\n2026-03-18T18:03:56Z [verbose] multus-daemon started\\\\n2026-03-18T18:03:56Z [verbose] Readiness Indicator file check\\\\n2026-03-18T18:04:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.736006 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8381f3c4-dca2-43ad-90cb-80fdd24d1397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af028acb3982bdc511e57661b10a59b3488f1c244edfb1e241d86fc56d05aa4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5972f5ca38303ef2ebf0480fb68cbe693f99
f58909bd703a7e9b35d6b6b4d8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:02:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 18:02:26.495367 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 18:02:26.497854 1 observer_polling.go:159] Starting file observer\\\\nI0318 18:02:26.536974 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 18:02:26.542824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 18:02:54.270280 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 18:02:54.270386 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f11fed99b3e0c3592033b1e88ef8e6316eaeb569687a3141c3ed629fe1ba64ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2101ff8ac77ab3c2ad89a85919eadea6336e52c1fa8fa35b30d2310a185a85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8d12a8adedc6347e344404eefcb3508701c3ae0c0c6a3c405a9b789262fd5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.746284 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70cdd8-b09d-497b-b12a-fe8f7c9c28cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55b1d1dd9c1f850855b50655cd769c4380c67ac5fcd7203eefc24d35e53bcb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3764491ec6fd4ded959c9f447badb396933fdb670769eece8e1371ce2df4288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b929d609244b89fe0628e9f9fde457d15fe0745a6fb11039befdd9b87fc7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:41 crc kubenswrapper[5008]: I0318 18:04:41.756960 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.197537 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.197755 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:42 crc kubenswrapper[5008]: E0318 18:04:42.198002 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.198092 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:42 crc kubenswrapper[5008]: E0318 18:04:42.198221 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:42 crc kubenswrapper[5008]: E0318 18:04:42.198349 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.399046 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgv8s_9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd/kube-multus/0.log" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.399137 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgv8s" event={"ID":"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd","Type":"ContainerStarted","Data":"49e87bbd2ba41b38445ad4d5a4cac446075b67c32199067b8e9316283fdc1d0b"} Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.420845 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49e87bbd2ba41b38445ad4d5a4cac446075b67c32199067b8e9316283fdc1d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:41Z\\\",\\\"message\\\":\\\"2026-03-18T18:03:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903\\\\n2026-03-18T18:03:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903 to /host/opt/cni/bin/\\\\n2026-03-18T18:03:56Z [verbose] multus-daemon started\\\\n2026-03-18T18:03:56Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T18:04:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.441138 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8381f3c4-dca2-43ad-90cb-80fdd24d1397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af028acb3982bdc511e57661b10a59b3488f1c244edfb1e241d86fc56d05aa4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5972f5ca38303ef2ebf0480fb68cbe693f99f58909bd703a7e9b35d6b6b4d8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:02:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 18:02:26.495367 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 18:02:26.497854 1 observer_polling.go:159] Starting file observer\\\\nI0318 18:02:26.536974 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 18:02:26.542824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 18:02:54.270280 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 18:02:54.270386 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f11fed99b3e0c3592033b1e88ef8e6316eaeb569687a3141c3ed629fe1ba64ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2101ff8ac77ab3c2ad89a85919eadea6336e52c1fa8fa35b30d2310a185a85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8d12a8adedc6347e344404eefcb3508701c3ae0c0c6a3c405a9b789262fd5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.461687 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70cdd8-b09d-497b-b12a-fe8f7c9c28cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55b1d1dd9c1f850855b50655cd769c4380c67ac5fcd7203eefc24d35e53bcb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3764491ec6fd4ded959c9f447badb396933fdb670769eece8e1371ce2df4288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b929d609244b89fe0628e9f9fde457d15fe0745a6fb11039befdd9b87fc7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.483892 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.516334 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:20Z\\\",\\\"message\\\":\\\"failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z]\\\\nI0318 18:04:20.365953 7209 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365958 7209 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0318 18:04:20.365962 7209 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0318 18:04:20.365966 7209 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365972 7209 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-5278w\\\\nI0318 18:04:20.365954 7209 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12
d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.530912 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.550059 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.564473 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.585146 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.603179 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.619179 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.633152 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf3
3763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.648481 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.667100 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.686416 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.701623 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc 
kubenswrapper[5008]: I0318 18:04:42.714076 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.745219 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:42 crc kubenswrapper[5008]: I0318 18:04:42.761578 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:42Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:43 crc kubenswrapper[5008]: I0318 18:04:43.197773 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:43 crc kubenswrapper[5008]: E0318 18:04:43.198027 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.197821 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.197893 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:44 crc kubenswrapper[5008]: E0318 18:04:44.198167 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.198335 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:44 crc kubenswrapper[5008]: E0318 18:04:44.198532 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:44 crc kubenswrapper[5008]: E0318 18:04:44.198694 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.199837 5008 scope.go:117] "RemoveContainer" containerID="357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.224353 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8381f3c4-dca2-43ad-90cb-80fdd24d1397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af028acb3982bdc511e57661b10a59b3488f1c244edfb1e241d86fc56d05aa4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5972f5ca38303ef2ebf0480fb68cbe693f99f58909bd703a7e9b35d6b6b4d8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:02:54Z\\\",\\\"messag
e\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 18:02:26.495367 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 18:02:26.497854 1 observer_polling.go:159] Starting file observer\\\\nI0318 18:02:26.536974 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 18:02:26.542824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 18:02:54.270280 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 18:02:54.270386 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f11fed99b3e0c3592033b1e88ef8e6316eaeb569687a3141c3ed629fe1ba64ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2101ff8ac77ab3c2ad89a85919eadea6336e52c1fa8fa35b30d2310a185a85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8d12a8adedc6347e344404eefcb3508701c3ae0c0c6a3c405a9b789262fd5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.251799 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70cdd8-b09d-497b-b12a-fe8f7c9c28cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55b1d1dd9c1f850855b50655cd769c4380c67ac5fcd7203eefc24d35e53bcb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3764491ec6fd4ded959c9f447badb396933fdb670769eece8e1371ce2df4288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b929d609244b89fe0628e9f9fde457d15fe0745a6fb11039befdd9b87fc7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.278295 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.297297 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49e87bbd2ba41b38445ad4d5a4cac446075b67c32199067b8e9316283fdc1d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:41Z\\\",\\\"message\\\":\\\"2026-03-18T18:03:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903\\\\n2026-03-18T18:03:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903 to /host/opt/cni/bin/\\\\n2026-03-18T18:03:56Z [verbose] multus-daemon started\\\\n2026-03-18T18:03:56Z [verbose] Readiness Indicator file check\\\\n2026-03-18T18:04:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.311644 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: E0318 18:04:44.317804 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.323135 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.340443 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jv
mwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"c
ri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.359058 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:20Z\\\",\\\"message\\\":\\\"failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z]\\\\nI0318 18:04:20.365953 7209 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365958 7209 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0318 18:04:20.365962 7209 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0318 18:04:20.365966 7209 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365972 7209 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-5278w\\\\nI0318 18:04:20.365954 7209 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12
d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.367693 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.387528 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.402730 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.409070 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/2.log" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.411736 5008 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerStarted","Data":"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9"} Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.412471 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.420530 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/r
un/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.435599 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.447777 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.457983 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf3
3763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.468386 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.486323 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.504669 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.515020 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc 
kubenswrapper[5008]: I0318 18:04:44.525766 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf33763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.536787 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.551383 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.570576 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.582045 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.597602 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.613881 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.645268 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.667631 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.678470 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc 
kubenswrapper[5008]: I0318 18:04:44.692297 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8381f3c4-dca2-43ad-90cb-80fdd24d1397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af028acb3982bdc511e57661b10a59b3488f1c244edfb1e241d86fc56d05aa4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5972f5ca38303ef2ebf0480fb68cbe693f99f58909bd703a7e9b35d6b6b4d8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:02:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 18:02:26.495367 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 18:02:26.497854 1 observer_polling.go:159] Starting file observer\\\\nI0318 18:02:26.536974 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 18:02:26.542824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 18:02:54.270280 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 18:02:54.270386 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f11fed99b3e0c3592033b1e88ef8e6316eaeb569687a3141c3ed629fe1ba64ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2101ff8ac77ab3c2ad89a85919eadea6336e52c1fa8fa35b30d2310a185a85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8d12a8adedc6347e344404eefcb3508701c3ae0c0c6a3c405a9b789262fd5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.703147 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70cdd8-b09d-497b-b12a-fe8f7c9c28cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55b1d1dd9c1f850855b50655cd769c4380c67ac5fcd7203eefc24d35e53bcb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3764491ec6fd4ded959c9f447badb396933fdb670769eece8e1371ce2df4288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b929d609244b89fe0628e9f9fde457d15fe0745a6fb11039befdd9b87fc7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.714480 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.725895 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49e87bbd2ba41b38445ad4d5a4cac446075b67c32199067b8e9316283fdc1d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:41Z\\\",\\\"message\\\":\\\"2026-03-18T18:03:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903\\\\n2026-03-18T18:03:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903 to /host/opt/cni/bin/\\\\n2026-03-18T18:03:56Z [verbose] multus-daemon started\\\\n2026-03-18T18:03:56Z [verbose] Readiness Indicator file check\\\\n2026-03-18T18:04:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.736812 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.745420 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.757076 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.771401 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:20Z\\\",\\\"message\\\":\\\"failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z]\\\\nI0318 18:04:20.365953 7209 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365958 7209 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0318 18:04:20.365962 7209 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0318 18:04:20.365966 7209 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365972 7209 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-5278w\\\\nI0318 18:04:20.365954 7209 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:44 crc kubenswrapper[5008]: I0318 18:04:44.780986 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.197831 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:45 crc kubenswrapper[5008]: E0318 18:04:45.198026 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.420896 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/3.log" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.421726 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/2.log" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.424465 5008 generic.go:334] "Generic (PLEG): container finished" podID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerID="e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9" exitCode=1 Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.424517 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9"} Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.424618 5008 scope.go:117] "RemoveContainer" containerID="357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.425712 5008 scope.go:117] "RemoveContainer" containerID="e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9" Mar 18 18:04:45 crc kubenswrapper[5008]: E0318 18:04:45.426027 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 
18:04:45.440146 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73
fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.513343 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.525994 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.534933 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc 
kubenswrapper[5008]: I0318 18:04:45.547057 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8381f3c4-dca2-43ad-90cb-80fdd24d1397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af028acb3982bdc511e57661b10a59b3488f1c244edfb1e241d86fc56d05aa4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5972f5ca38303ef2ebf0480fb68cbe693f99f58909bd703a7e9b35d6b6b4d8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:02:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 18:02:26.495367 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 18:02:26.497854 1 observer_polling.go:159] Starting file observer\\\\nI0318 18:02:26.536974 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 18:02:26.542824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 18:02:54.270280 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 18:02:54.270386 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f11fed99b3e0c3592033b1e88ef8e6316eaeb569687a3141c3ed629fe1ba64ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2101ff8ac77ab3c2ad89a85919eadea6336e52c1fa8fa35b30d2310a185a85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8d12a8adedc6347e344404eefcb3508701c3ae0c0c6a3c405a9b789262fd5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.560533 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70cdd8-b09d-497b-b12a-fe8f7c9c28cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55b1d1dd9c1f850855b50655cd769c4380c67ac5fcd7203eefc24d35e53bcb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3764491ec6fd4ded959c9f447badb396933fdb670769eece8e1371ce2df4288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b929d609244b89fe0628e9f9fde457d15fe0745a6fb11039befdd9b87fc7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.574192 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.588036 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49e87bbd2ba41b38445ad4d5a4cac446075b67c32199067b8e9316283fdc1d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:41Z\\\",\\\"message\\\":\\\"2026-03-18T18:03:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903\\\\n2026-03-18T18:03:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903 to /host/opt/cni/bin/\\\\n2026-03-18T18:03:56Z [verbose] multus-daemon started\\\\n2026-03-18T18:03:56Z [verbose] Readiness Indicator file check\\\\n2026-03-18T18:04:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.600714 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.613838 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.633819 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.655379 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357da1e19579bcfe9c13a721d5742b9cf949f7654152a6e4b02c0589d045a924\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:20Z\\\",\\\"message\\\":\\\"failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:20Z is after 2025-08-24T17:21:41Z]\\\\nI0318 18:04:20.365953 7209 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365958 7209 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0318 18:04:20.365962 7209 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0318 18:04:20.365966 7209 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0318 18:04:20.365972 7209 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-5278w\\\\nI0318 18:04:20.365954 7209 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:45Z\\\",\\\"message\\\":\\\"handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z]\\\\nI0318 18:04:45.008700 7471 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\",\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b
34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.668178 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.682485 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf33763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.697823 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.715097 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.732985 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.744500 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:45 crc kubenswrapper[5008]: I0318 18:04:45.756187 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:45Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.197349 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.197464 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.197667 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:46 crc kubenswrapper[5008]: E0318 18:04:46.197694 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:46 crc kubenswrapper[5008]: E0318 18:04:46.197832 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:46 crc kubenswrapper[5008]: E0318 18:04:46.197969 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.431796 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/3.log" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.437357 5008 scope.go:117] "RemoveContainer" containerID="e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9" Mar 18 18:04:46 crc kubenswrapper[5008]: E0318 18:04:46.437608 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.468717 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:45Z\\\",\\\"message\\\":\\\"handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has 
stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z]\\\\nI0318 18:04:45.008700 7471 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\",\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12
d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.484931 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.504852 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.520594 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.544033 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.546185 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.546257 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.546279 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.546308 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.546330 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:46Z","lastTransitionTime":"2026-03-18T18:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.567294 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: E0318 18:04:46.571524 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.579976 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.580028 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.580045 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.580070 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.580089 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:46Z","lastTransitionTime":"2026-03-18T18:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.587288 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: E0318 18:04:46.597618 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.601903 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.601961 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.601978 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.602005 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.602024 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:46Z","lastTransitionTime":"2026-03-18T18:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.605797 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf33763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: E0318 18:04:46.617041 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.623019 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.623192 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.623323 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.623619 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.623654 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:46Z","lastTransitionTime":"2026-03-18T18:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.625885 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.639395 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: E0318 18:04:46.642836 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.647318 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.647364 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.647381 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.647404 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.647423 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:46Z","lastTransitionTime":"2026-03-18T18:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.654968 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.668740 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc 
kubenswrapper[5008]: E0318 18:04:46.668814 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: E0318 18:04:46.669605 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.683129 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 
18:04:46.703295 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.723367 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.741576 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49e87bbd2ba41b38445ad4d5a4cac446075b67c32199067b8e9316283fdc1d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:41Z\\\",\\\"message\\\":\\\"2026-03-18T18:03:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903\\\\n2026-03-18T18:03:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903 to /host/opt/cni/bin/\\\\n2026-03-18T18:03:56Z [verbose] multus-daemon started\\\\n2026-03-18T18:03:56Z [verbose] Readiness Indicator file check\\\\n2026-03-18T18:04:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.759000 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8381f3c4-dca2-43ad-90cb-80fdd24d1397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af028acb3982bdc511e57661b10a59b3488f1c244edfb1e241d86fc56d05aa4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5972f5ca38303ef2ebf0480fb68cbe693f99f58909bd703a7e9b35d6b6b4d8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:02:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 18:02:26.495367 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 18:02:26.497854 1 observer_polling.go:159] Starting file observer\\\\nI0318 18:02:26.536974 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 18:02:26.542824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 18:02:54.270280 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 18:02:54.270386 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f11fed99b3e0c3592033b1e88ef8e6316eaeb569687a3141c3ed629fe1ba64ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2101ff8ac77ab3c2ad89a85919eadea6336e52c1fa8fa35b30d2310a185a85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8d12a8adedc6347e344404eefcb3508701c3ae0c0c6a3c405a9b789262fd5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.775870 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70cdd8-b09d-497b-b12a-fe8f7c9c28cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55b1d1dd9c1f850855b50655cd769c4380c67ac5fcd7203eefc24d35e53bcb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-sche
duler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3764491ec6fd4ded959c9f447badb396933fdb670769eece8e1371ce2df4288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b929d609244b89fe0628e9f9fde457d15fe0745a6fb11039befdd9b87fc7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:46 crc kubenswrapper[5008]: I0318 18:04:46.794010 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:46Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:47 crc kubenswrapper[5008]: I0318 18:04:47.198012 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:47 crc kubenswrapper[5008]: E0318 18:04:47.198214 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:48 crc kubenswrapper[5008]: I0318 18:04:48.197630 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:48 crc kubenswrapper[5008]: I0318 18:04:48.197657 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:48 crc kubenswrapper[5008]: I0318 18:04:48.197761 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:48 crc kubenswrapper[5008]: E0318 18:04:48.197815 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:48 crc kubenswrapper[5008]: E0318 18:04:48.197990 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:48 crc kubenswrapper[5008]: E0318 18:04:48.198117 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:49 crc kubenswrapper[5008]: I0318 18:04:49.197539 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:49 crc kubenswrapper[5008]: E0318 18:04:49.197954 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:49 crc kubenswrapper[5008]: E0318 18:04:49.319005 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 18:04:50 crc kubenswrapper[5008]: I0318 18:04:50.198240 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:50 crc kubenswrapper[5008]: I0318 18:04:50.198341 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:50 crc kubenswrapper[5008]: E0318 18:04:50.198392 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:50 crc kubenswrapper[5008]: E0318 18:04:50.198587 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:50 crc kubenswrapper[5008]: I0318 18:04:50.198614 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:50 crc kubenswrapper[5008]: E0318 18:04:50.198759 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:51 crc kubenswrapper[5008]: I0318 18:04:51.197959 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:51 crc kubenswrapper[5008]: E0318 18:04:51.198176 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:52 crc kubenswrapper[5008]: I0318 18:04:52.197609 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:52 crc kubenswrapper[5008]: I0318 18:04:52.197728 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:52 crc kubenswrapper[5008]: E0318 18:04:52.197863 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:52 crc kubenswrapper[5008]: E0318 18:04:52.198001 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:52 crc kubenswrapper[5008]: I0318 18:04:52.198477 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:52 crc kubenswrapper[5008]: E0318 18:04:52.199309 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:53 crc kubenswrapper[5008]: I0318 18:04:53.197841 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:53 crc kubenswrapper[5008]: E0318 18:04:53.198147 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.197256 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.197366 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:54 crc kubenswrapper[5008]: E0318 18:04:54.197511 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.197542 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:54 crc kubenswrapper[5008]: E0318 18:04:54.197735 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:54 crc kubenswrapper[5008]: E0318 18:04:54.197930 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.218615 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8381f3c4-dca2-43ad-90cb-80fdd24d1397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af028acb3982bdc511e57661b10a59b3488f1c244edfb1e241d86fc56d05aa4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5972f5ca38303ef2ebf0480fb68cbe693f99f58909bd703a7e9b35d6b6b4d8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:02:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 18:02:26.495367 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 18:02:26.497854 1 observer_polling.go:159] Starting file observer\\\\nI0318 18:02:26.536974 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 18:02:26.542824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0318 18:02:54.270280 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0318 18:02:54.270386 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f11fed99b3e0c3592033b1e88ef8e6316eaeb569687a3141c3ed629fe1ba64ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2101ff8ac77ab3c2ad89a85919eadea6336e52c1fa8fa35b30d2310a185a85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e8d12a8adedc6347e344404eefcb3508701c3ae0c0c6a3c405a9b789262fd5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.233985 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee70cdd8-b09d-497b-b12a-fe8f7c9c28cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55b1d1dd9c1f850855b50655cd769c4380c67ac5fcd7203eefc24d35e53bcb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-sche
duler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3764491ec6fd4ded959c9f447badb396933fdb670769eece8e1371ce2df4288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b929d609244b89fe0628e9f9fde457d15fe0745a6fb11039befdd9b87fc7a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f86e1943ecfb15c8cd5fb96e7ea141e3756f9fbc8f7549ed4274c2937b248946\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.252862 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed627696de288acfdf8735c2ab209d000f4cdf5c239c0b1136a653a7ab6a41d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.266755 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-sgv8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49e87bbd2ba41b38445ad4d5a4cac446075b67c32199067b8e9316283fdc1d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:41Z\\\",\\\"message\\\":\\\"2026-03-18T18:03:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903\\\\n2026-03-18T18:03:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f686f941-369b-4afa-a80a-ae0c9865f903 to /host/opt/cni/bin/\\\\n2026-03-18T18:03:56Z [verbose] multus-daemon started\\\\n2026-03-18T18:03:56Z [verbose] Readiness Indicator file check\\\\n2026-03-18T18:04:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46tr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-sgv8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.284688 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.299744 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nxl6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2f0793b-3ae6-43d8-938e-f885d593d0a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56563e4d162a19b386a982dc9ff815542187023e2b13dae8cc45f1a0da742c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zkcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nxl6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.315463 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"322f1eea-395d-476c-a43b-c68071d0af20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4383aa3d0974750c79eeac83c24819e9f1420ee6053b580e63cb10f97a7ba15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ac5661f7e156252368b58d306f02007c3258fa2e1ceceabea1d24de2d1c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac768be559f620511681cc2a0cfea9c12167dc8f61c28366c04e1fcd8e933ab5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3190a151f36c6443aa029a4a6a37299f2ac87a8c546651e4edeef038aac8b1f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbe5089c8b98cbf56c2e3969419823ec57451318a23acae8495abdd00487207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a8167
77a7153b6ceda54287accad3ae587305\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b160663f7249cb799f891eb3c19a816777a7153b6ceda54287accad3ae587305\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239f01ee951559ab195f0b0fb924d8a06e9f6c98d9e4c82cdef4fc8874b90ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-18T18:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvmwq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l6h7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: E0318 18:04:54.320787 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.338749 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b105c010-f5cb-41ae-bdff-62bc05da91a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T18:04:45Z\\\",\\\"message\\\":\\\"handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to 
call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:44Z is after 2025-08-24T17:21:41Z]\\\\nI0318 18:04:45.008700 7471 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\",\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:04:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69cc15a7224e2a4b12
d450beaadaff100369a2404059b34b042cdd849f13120a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-29hqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5278w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.350237 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b8t8h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dae087e-43c5-442e-98db-b815e8993c8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25a9b9242dd75f6d4c9654acaf1a32c5504d55f4beaaf209c815fd3d9c18537b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc54p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b8t8h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.370827 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.387281 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.403061 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1149928dedfd3e72479d1b6a0ecb11d1e7d9006bafa40aa2fa946ea87035d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0aea98a120c69dfc221eb15683942a1f076b1985c6cd83cdecfdef69efb18ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.415020 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c714de0b96098f4b9f2bba28c0cfd486b8ddd24a06cec98b4461bbf140d4d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.423032 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de73a23f-7b17-40f3-bb5d-14c8bff178b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705c9996158e9f84b6dfe3677cb1fa6e2a76368302ec405736ae684df9f52847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:03:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5mmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:03:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crzrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.440170 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff5b1f8d-21ca-4a18-952a-bbc202aeb521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf8bb2bb96fb119752477b001549e46c726bf4563b2ff3b21861162a92451077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c84704f28f3b426cf055f8f5f74d1eb4fbf3
3763d4a75ed403378ea84191f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbxck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vjsrq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.453647 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aa723d0-dca0-4422-9efb-d76ac76e7e81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1f5412a0ced04506c33cc27f64b88a604b9a494ed9e873cc518be10b7ff80d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f380eecd9d73fb036f0357035fd3079b8f20f1d3a2c77a529cf7ac4ac6a8b9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.487923 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3b0058b-3deb-4333-b16f-821b4a9c8629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7327ab71dfbe97c7ba17c66f22f069e6b99de27728c103c5860b48741aa0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573bd7891f63465cb964150f10c6fb6d23623e1b7704169994fea3bc873bf39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9f7c44c5f5eb7c52ed54555baa92ef9e5f32218aa3f9b239861c89422d0ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc78e9ef511f612893756e6bf9b8323ebf7bbbbeddd13785bf88e78e50fee697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5e25dc462adc8d6ce9af15e2d1c673a6fa32382018565ff6cf513c3a1e7157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0adf412bf74db646fd1dd6727dde3772f6b799f87e5e50a5a5b8bd4ed855b6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87a761c1a68ca026599390a7d7bbe2b1d46fcac1a79621651f309864ad6e6a1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd04fb28547c3da6f28d4481ab55cfd5452bb87597d043d54883f608afdcaa2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.508837 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:02:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:26Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T18:03:33Z\\\",\\\"message\\\":\\\"ed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1773857013\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1773857013\\\\\\\\\\\\\\\" (2026-03-18 17:03:33 +0000 UTC to 2027-03-18 17:03:33 +0000 UTC (now=2026-03-18 18:03:33.400442675 +0000 UTC))\\\\\\\"\\\\nI0318 18:03:33.400473 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0318 18:03:33.400490 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0318 18:03:33.400507 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400521 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0318 18:03:33.400542 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1535724668/tls.crt::/tmp/serving-cert-1535724668/tls.key\\\\\\\"\\\\nI0318 18:03:33.400694 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI0318 18:03:33.400891 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400903 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0318 18:03:33.400922 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0318 18:03:33.400928 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0318 18:03:33.400998 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0318 18:03:33.401012 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0318 18:03:33.401495 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T18:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:02:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:02:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-18T18:02:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:02:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:54 crc kubenswrapper[5008]: I0318 18:04:54.529264 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n7tw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T18:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g2z9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:54Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:55 crc 
kubenswrapper[5008]: I0318 18:04:55.197852 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:55 crc kubenswrapper[5008]: E0318 18:04:55.198623 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.197701 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.197744 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.197702 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:56 crc kubenswrapper[5008]: E0318 18:04:56.197818 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:56 crc kubenswrapper[5008]: E0318 18:04:56.197914 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:56 crc kubenswrapper[5008]: E0318 18:04:56.198171 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.917629 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.917730 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.917755 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.917782 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.917804 5008 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:56Z","lastTransitionTime":"2026-03-18T18:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 18:04:56 crc kubenswrapper[5008]: E0318 18:04:56.941359 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.947733 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.947790 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.947808 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.947833 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.947852 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:56Z","lastTransitionTime":"2026-03-18T18:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:56 crc kubenswrapper[5008]: E0318 18:04:56.968264 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.974111 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.974453 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.974707 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.974912 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:56 crc kubenswrapper[5008]: I0318 18:04:56.975056 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:56Z","lastTransitionTime":"2026-03-18T18:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:56 crc kubenswrapper[5008]: E0318 18:04:56.995816 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:57 crc kubenswrapper[5008]: I0318 18:04:57.001674 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:57 crc kubenswrapper[5008]: I0318 18:04:57.001748 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:57 crc kubenswrapper[5008]: I0318 18:04:57.001768 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:57 crc kubenswrapper[5008]: I0318 18:04:57.001794 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:57 crc kubenswrapper[5008]: I0318 18:04:57.001814 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:57Z","lastTransitionTime":"2026-03-18T18:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:57 crc kubenswrapper[5008]: E0318 18:04:57.024998 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:57 crc kubenswrapper[5008]: I0318 18:04:57.030931 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:04:57 crc kubenswrapper[5008]: I0318 18:04:57.031079 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:04:57 crc kubenswrapper[5008]: I0318 18:04:57.031099 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:04:57 crc kubenswrapper[5008]: I0318 18:04:57.031126 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:04:57 crc kubenswrapper[5008]: I0318 18:04:57.031147 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:04:57Z","lastTransitionTime":"2026-03-18T18:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:04:57 crc kubenswrapper[5008]: E0318 18:04:57.052252 5008 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T18:04:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8b8aa0da-2a30-4cfb-ae9e-b1bd69b061b3\\\",\\\"systemUUID\\\":\\\"85242208-ddaf-4ad1-b838-03a8e3bf165e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 18 18:04:57 crc kubenswrapper[5008]: E0318 18:04:57.052417 5008 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 18:04:57 crc kubenswrapper[5008]: I0318 18:04:57.198043 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:57 crc kubenswrapper[5008]: E0318 18:04:57.198212 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:58 crc kubenswrapper[5008]: I0318 18:04:58.197356 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:04:58 crc kubenswrapper[5008]: E0318 18:04:58.197524 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:04:58 crc kubenswrapper[5008]: I0318 18:04:58.197938 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:04:58 crc kubenswrapper[5008]: I0318 18:04:58.197951 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:04:58 crc kubenswrapper[5008]: I0318 18:04:58.199144 5008 scope.go:117] "RemoveContainer" containerID="e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9" Mar 18 18:04:58 crc kubenswrapper[5008]: E0318 18:04:58.199393 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" Mar 18 18:04:58 crc kubenswrapper[5008]: E0318 18:04:58.199737 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:04:58 crc kubenswrapper[5008]: E0318 18:04:58.199974 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:04:59 crc kubenswrapper[5008]: I0318 18:04:59.197848 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:04:59 crc kubenswrapper[5008]: E0318 18:04:59.198132 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:04:59 crc kubenswrapper[5008]: E0318 18:04:59.322747 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 18:05:00 crc kubenswrapper[5008]: I0318 18:05:00.197722 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:00 crc kubenswrapper[5008]: I0318 18:05:00.197826 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:00 crc kubenswrapper[5008]: I0318 18:05:00.197854 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:00 crc kubenswrapper[5008]: E0318 18:05:00.197986 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:00 crc kubenswrapper[5008]: E0318 18:05:00.198292 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:00 crc kubenswrapper[5008]: E0318 18:05:00.198259 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:01 crc kubenswrapper[5008]: I0318 18:05:01.197853 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:01 crc kubenswrapper[5008]: E0318 18:05:01.198111 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:02 crc kubenswrapper[5008]: I0318 18:05:02.197455 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:02 crc kubenswrapper[5008]: I0318 18:05:02.197480 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:02 crc kubenswrapper[5008]: I0318 18:05:02.197681 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:02 crc kubenswrapper[5008]: E0318 18:05:02.197807 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:02 crc kubenswrapper[5008]: E0318 18:05:02.198131 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:02 crc kubenswrapper[5008]: E0318 18:05:02.198541 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:03 crc kubenswrapper[5008]: I0318 18:05:03.197466 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:03 crc kubenswrapper[5008]: E0318 18:05:03.197728 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:04 crc kubenswrapper[5008]: I0318 18:05:04.198411 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:04 crc kubenswrapper[5008]: I0318 18:05:04.198465 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:04 crc kubenswrapper[5008]: E0318 18:05:04.198579 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:04 crc kubenswrapper[5008]: E0318 18:05:04.198814 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:04 crc kubenswrapper[5008]: I0318 18:05:04.198945 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:04 crc kubenswrapper[5008]: E0318 18:05:04.199109 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:04 crc kubenswrapper[5008]: I0318 18:05:04.248955 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podStartSLOduration=109.248933766 podStartE2EDuration="1m49.248933766s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:04.248269088 +0000 UTC m=+160.767742187" watchObservedRunningTime="2026-03-18 18:05:04.248933766 +0000 UTC m=+160.768406855" Mar 18 18:05:04 crc kubenswrapper[5008]: I0318 18:05:04.273323 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vjsrq" podStartSLOduration=109.273300924 podStartE2EDuration="1m49.273300924s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:04.272958205 +0000 UTC m=+160.792431314" watchObservedRunningTime="2026-03-18 18:05:04.273300924 +0000 UTC m=+160.792774013" Mar 18 18:05:04 crc kubenswrapper[5008]: E0318 18:05:04.323653 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 18:05:04 crc kubenswrapper[5008]: I0318 18:05:04.402134 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=71.402098955 podStartE2EDuration="1m11.402098955s" podCreationTimestamp="2026-03-18 18:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:04.370047649 +0000 UTC m=+160.889520748" watchObservedRunningTime="2026-03-18 18:05:04.402098955 +0000 UTC m=+160.921572054" Mar 18 18:05:04 crc kubenswrapper[5008]: I0318 18:05:04.402986 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=82.402973079 podStartE2EDuration="1m22.402973079s" podCreationTimestamp="2026-03-18 18:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:04.399902996 +0000 UTC m=+160.919376125" watchObservedRunningTime="2026-03-18 18:05:04.402973079 +0000 UTC m=+160.922446178" Mar 18 18:05:04 crc kubenswrapper[5008]: I0318 18:05:04.439203 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.439173447 podStartE2EDuration="1m22.439173447s" podCreationTimestamp="2026-03-18 18:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:04.437755139 +0000 UTC m=+160.957228228" watchObservedRunningTime="2026-03-18 18:05:04.439173447 +0000 UTC m=+160.958646566" Mar 18 18:05:04 crc kubenswrapper[5008]: I0318 18:05:04.484842 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-sgv8s" podStartSLOduration=109.484814241 podStartE2EDuration="1m49.484814241s" 
podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:04.484163893 +0000 UTC m=+161.003636972" watchObservedRunningTime="2026-03-18 18:05:04.484814241 +0000 UTC m=+161.004287320" Mar 18 18:05:04 crc kubenswrapper[5008]: I0318 18:05:04.500207 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=24.500173156 podStartE2EDuration="24.500173156s" podCreationTimestamp="2026-03-18 18:04:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:04.499784295 +0000 UTC m=+161.019257374" watchObservedRunningTime="2026-03-18 18:05:04.500173156 +0000 UTC m=+161.019646275" Mar 18 18:05:04 crc kubenswrapper[5008]: I0318 18:05:04.516496 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=32.516478117 podStartE2EDuration="32.516478117s" podCreationTimestamp="2026-03-18 18:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:04.516378414 +0000 UTC m=+161.035851553" watchObservedRunningTime="2026-03-18 18:05:04.516478117 +0000 UTC m=+161.035951236" Mar 18 18:05:04 crc kubenswrapper[5008]: I0318 18:05:04.622108 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-b8t8h" podStartSLOduration=109.622083041 podStartE2EDuration="1m49.622083041s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:04.621205787 +0000 UTC m=+161.140678866" 
watchObservedRunningTime="2026-03-18 18:05:04.622083041 +0000 UTC m=+161.141556130" Mar 18 18:05:04 crc kubenswrapper[5008]: I0318 18:05:04.665157 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8nxl6" podStartSLOduration=110.665138174 podStartE2EDuration="1m50.665138174s" podCreationTimestamp="2026-03-18 18:03:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:04.649374308 +0000 UTC m=+161.168847387" watchObservedRunningTime="2026-03-18 18:05:04.665138174 +0000 UTC m=+161.184611263" Mar 18 18:05:05 crc kubenswrapper[5008]: I0318 18:05:05.197794 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:05 crc kubenswrapper[5008]: E0318 18:05:05.198018 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:06 crc kubenswrapper[5008]: I0318 18:05:06.197880 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:06 crc kubenswrapper[5008]: I0318 18:05:06.197951 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:06 crc kubenswrapper[5008]: I0318 18:05:06.197954 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:06 crc kubenswrapper[5008]: E0318 18:05:06.198184 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:06 crc kubenswrapper[5008]: E0318 18:05:06.198356 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:06 crc kubenswrapper[5008]: E0318 18:05:06.198447 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.198282 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:07 crc kubenswrapper[5008]: E0318 18:05:07.198831 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.360381 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.360467 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.360488 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.360514 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.360534 5008 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T18:05:07Z","lastTransitionTime":"2026-03-18T18:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.384464 5008 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.399412 5008 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.430432 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-l6h7t" podStartSLOduration=112.430366916 podStartE2EDuration="1m52.430366916s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:04.666016008 +0000 UTC m=+161.185489097" watchObservedRunningTime="2026-03-18 18:05:07.430366916 +0000 UTC m=+163.949840025" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.430718 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n"] Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.431244 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.435658 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.435923 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.436188 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.436196 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.478752 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b3e4f6f-155c-421b-bb25-d15c22af3365-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.478795 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b3e4f6f-155c-421b-bb25-d15c22af3365-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.478846 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/9b3e4f6f-155c-421b-bb25-d15c22af3365-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.478937 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b3e4f6f-155c-421b-bb25-d15c22af3365-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.479016 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9b3e4f6f-155c-421b-bb25-d15c22af3365-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.580510 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9b3e4f6f-155c-421b-bb25-d15c22af3365-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.581036 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b3e4f6f-155c-421b-bb25-d15c22af3365-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.581283 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9b3e4f6f-155c-421b-bb25-d15c22af3365-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.581468 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b3e4f6f-155c-421b-bb25-d15c22af3365-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.581703 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b3e4f6f-155c-421b-bb25-d15c22af3365-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.580722 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9b3e4f6f-155c-421b-bb25-d15c22af3365-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.583407 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9b3e4f6f-155c-421b-bb25-d15c22af3365-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.581497 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9b3e4f6f-155c-421b-bb25-d15c22af3365-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.592225 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b3e4f6f-155c-421b-bb25-d15c22af3365-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.611140 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b3e4f6f-155c-421b-bb25-d15c22af3365-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2tl9n\" (UID: \"9b3e4f6f-155c-421b-bb25-d15c22af3365\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:07 crc kubenswrapper[5008]: I0318 18:05:07.757397 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" Mar 18 18:05:08 crc kubenswrapper[5008]: I0318 18:05:08.198321 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:08 crc kubenswrapper[5008]: I0318 18:05:08.198452 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:08 crc kubenswrapper[5008]: I0318 18:05:08.198463 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:08 crc kubenswrapper[5008]: E0318 18:05:08.198650 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:08 crc kubenswrapper[5008]: E0318 18:05:08.199067 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:08 crc kubenswrapper[5008]: E0318 18:05:08.199376 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:08 crc kubenswrapper[5008]: I0318 18:05:08.529102 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" event={"ID":"9b3e4f6f-155c-421b-bb25-d15c22af3365","Type":"ContainerStarted","Data":"d9f72c1268428236c33b0a4594d62a1d5dd4a3fd1a3f26501858b5809ccbb45a"} Mar 18 18:05:08 crc kubenswrapper[5008]: I0318 18:05:08.529171 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" event={"ID":"9b3e4f6f-155c-421b-bb25-d15c22af3365","Type":"ContainerStarted","Data":"e9b09bf38df6938586aeae1800bcc80c3f0724e38faa7f0572e674bb025c85d9"} Mar 18 18:05:08 crc kubenswrapper[5008]: I0318 18:05:08.552863 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2tl9n" podStartSLOduration=113.552835702 podStartE2EDuration="1m53.552835702s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:08.551103076 +0000 UTC m=+165.070576195" watchObservedRunningTime="2026-03-18 18:05:08.552835702 +0000 UTC m=+165.072308821" Mar 18 18:05:09 crc kubenswrapper[5008]: I0318 18:05:09.199057 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:09 crc kubenswrapper[5008]: E0318 18:05:09.199469 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:09 crc kubenswrapper[5008]: E0318 18:05:09.325866 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 18:05:10 crc kubenswrapper[5008]: I0318 18:05:10.198241 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:10 crc kubenswrapper[5008]: I0318 18:05:10.198407 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:10 crc kubenswrapper[5008]: I0318 18:05:10.198276 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:10 crc kubenswrapper[5008]: E0318 18:05:10.198525 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:10 crc kubenswrapper[5008]: E0318 18:05:10.198907 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:10 crc kubenswrapper[5008]: E0318 18:05:10.199101 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:10 crc kubenswrapper[5008]: I0318 18:05:10.200671 5008 scope.go:117] "RemoveContainer" containerID="e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9" Mar 18 18:05:10 crc kubenswrapper[5008]: E0318 18:05:10.201215 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" Mar 18 18:05:11 crc kubenswrapper[5008]: I0318 18:05:11.197891 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:11 crc kubenswrapper[5008]: E0318 18:05:11.198105 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:11 crc kubenswrapper[5008]: I0318 18:05:11.234723 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs\") pod \"network-metrics-daemon-g2z9p\" (UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:11 crc kubenswrapper[5008]: E0318 18:05:11.234954 5008 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:05:11 crc kubenswrapper[5008]: E0318 18:05:11.235058 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs podName:1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7 nodeName:}" failed. No retries permitted until 2026-03-18 18:06:15.235033722 +0000 UTC m=+231.754506801 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs") pod "network-metrics-daemon-g2z9p" (UID: "1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 18:05:12 crc kubenswrapper[5008]: I0318 18:05:12.198000 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:12 crc kubenswrapper[5008]: I0318 18:05:12.198050 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:12 crc kubenswrapper[5008]: I0318 18:05:12.198112 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:12 crc kubenswrapper[5008]: E0318 18:05:12.198234 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:12 crc kubenswrapper[5008]: E0318 18:05:12.198517 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:12 crc kubenswrapper[5008]: E0318 18:05:12.198767 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:13 crc kubenswrapper[5008]: I0318 18:05:13.198274 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:13 crc kubenswrapper[5008]: E0318 18:05:13.198500 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:14 crc kubenswrapper[5008]: I0318 18:05:14.198114 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:14 crc kubenswrapper[5008]: I0318 18:05:14.198170 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:14 crc kubenswrapper[5008]: I0318 18:05:14.198400 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:14 crc kubenswrapper[5008]: E0318 18:05:14.200175 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:14 crc kubenswrapper[5008]: E0318 18:05:14.200297 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:14 crc kubenswrapper[5008]: E0318 18:05:14.200428 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:14 crc kubenswrapper[5008]: E0318 18:05:14.327343 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 18:05:15 crc kubenswrapper[5008]: I0318 18:05:15.197170 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:15 crc kubenswrapper[5008]: E0318 18:05:15.197354 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:16 crc kubenswrapper[5008]: I0318 18:05:16.197975 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:16 crc kubenswrapper[5008]: I0318 18:05:16.198028 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:16 crc kubenswrapper[5008]: E0318 18:05:16.198179 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:16 crc kubenswrapper[5008]: E0318 18:05:16.198450 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:16 crc kubenswrapper[5008]: I0318 18:05:16.198600 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:16 crc kubenswrapper[5008]: E0318 18:05:16.198731 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:17 crc kubenswrapper[5008]: I0318 18:05:17.198188 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:17 crc kubenswrapper[5008]: E0318 18:05:17.198728 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:18 crc kubenswrapper[5008]: I0318 18:05:18.197981 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:18 crc kubenswrapper[5008]: I0318 18:05:18.198013 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:18 crc kubenswrapper[5008]: E0318 18:05:18.198208 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:18 crc kubenswrapper[5008]: I0318 18:05:18.198266 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:18 crc kubenswrapper[5008]: E0318 18:05:18.198350 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:18 crc kubenswrapper[5008]: E0318 18:05:18.198645 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:19 crc kubenswrapper[5008]: I0318 18:05:19.198089 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:19 crc kubenswrapper[5008]: E0318 18:05:19.198331 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:19 crc kubenswrapper[5008]: E0318 18:05:19.328204 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 18:05:20 crc kubenswrapper[5008]: I0318 18:05:20.198005 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:20 crc kubenswrapper[5008]: I0318 18:05:20.198087 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:20 crc kubenswrapper[5008]: E0318 18:05:20.198243 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:20 crc kubenswrapper[5008]: I0318 18:05:20.198350 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:20 crc kubenswrapper[5008]: E0318 18:05:20.198630 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:20 crc kubenswrapper[5008]: E0318 18:05:20.198724 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:21 crc kubenswrapper[5008]: I0318 18:05:21.198107 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:21 crc kubenswrapper[5008]: E0318 18:05:21.198297 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:22 crc kubenswrapper[5008]: I0318 18:05:22.198116 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:22 crc kubenswrapper[5008]: I0318 18:05:22.198120 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:22 crc kubenswrapper[5008]: E0318 18:05:22.198283 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:22 crc kubenswrapper[5008]: I0318 18:05:22.198132 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:22 crc kubenswrapper[5008]: E0318 18:05:22.198483 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:22 crc kubenswrapper[5008]: E0318 18:05:22.198697 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:23 crc kubenswrapper[5008]: I0318 18:05:23.197409 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:23 crc kubenswrapper[5008]: E0318 18:05:23.197660 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:23 crc kubenswrapper[5008]: I0318 18:05:23.198948 5008 scope.go:117] "RemoveContainer" containerID="e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9" Mar 18 18:05:23 crc kubenswrapper[5008]: E0318 18:05:23.199163 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5278w_openshift-ovn-kubernetes(b105c010-f5cb-41ae-bdff-62bc05da91a1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" Mar 18 18:05:24 crc kubenswrapper[5008]: I0318 18:05:24.197410 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:24 crc kubenswrapper[5008]: I0318 18:05:24.197490 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:24 crc kubenswrapper[5008]: I0318 18:05:24.197496 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:24 crc kubenswrapper[5008]: E0318 18:05:24.198608 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:24 crc kubenswrapper[5008]: E0318 18:05:24.198828 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:24 crc kubenswrapper[5008]: E0318 18:05:24.199147 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:24 crc kubenswrapper[5008]: E0318 18:05:24.328947 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 18:05:25 crc kubenswrapper[5008]: I0318 18:05:25.197616 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:25 crc kubenswrapper[5008]: E0318 18:05:25.197826 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:26 crc kubenswrapper[5008]: I0318 18:05:26.197870 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:26 crc kubenswrapper[5008]: I0318 18:05:26.197956 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:26 crc kubenswrapper[5008]: E0318 18:05:26.198066 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:26 crc kubenswrapper[5008]: E0318 18:05:26.198275 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:26 crc kubenswrapper[5008]: I0318 18:05:26.198757 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:26 crc kubenswrapper[5008]: E0318 18:05:26.199025 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:27 crc kubenswrapper[5008]: I0318 18:05:27.198074 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:27 crc kubenswrapper[5008]: E0318 18:05:27.198286 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:27 crc kubenswrapper[5008]: I0318 18:05:27.602473 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgv8s_9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd/kube-multus/1.log" Mar 18 18:05:27 crc kubenswrapper[5008]: I0318 18:05:27.603108 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgv8s_9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd/kube-multus/0.log" Mar 18 18:05:27 crc kubenswrapper[5008]: I0318 18:05:27.603176 5008 generic.go:334] "Generic (PLEG): container finished" podID="9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd" containerID="49e87bbd2ba41b38445ad4d5a4cac446075b67c32199067b8e9316283fdc1d0b" exitCode=1 Mar 18 18:05:27 crc kubenswrapper[5008]: I0318 18:05:27.603220 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgv8s" event={"ID":"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd","Type":"ContainerDied","Data":"49e87bbd2ba41b38445ad4d5a4cac446075b67c32199067b8e9316283fdc1d0b"} Mar 18 18:05:27 crc kubenswrapper[5008]: I0318 18:05:27.603266 5008 scope.go:117] "RemoveContainer" containerID="4cc3436d47104a689857992e527fa89ed59179a50a1f3c92bed8186c807937a9" Mar 18 18:05:27 crc kubenswrapper[5008]: I0318 18:05:27.604472 5008 scope.go:117] "RemoveContainer" containerID="49e87bbd2ba41b38445ad4d5a4cac446075b67c32199067b8e9316283fdc1d0b" Mar 18 18:05:27 crc kubenswrapper[5008]: E0318 18:05:27.604798 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-sgv8s_openshift-multus(9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd)\"" pod="openshift-multus/multus-sgv8s" podUID="9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd" Mar 18 18:05:28 crc kubenswrapper[5008]: I0318 18:05:28.198295 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:28 crc kubenswrapper[5008]: I0318 18:05:28.198310 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:28 crc kubenswrapper[5008]: E0318 18:05:28.198454 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:28 crc kubenswrapper[5008]: I0318 18:05:28.198517 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:28 crc kubenswrapper[5008]: E0318 18:05:28.198685 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:28 crc kubenswrapper[5008]: E0318 18:05:28.198982 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:28 crc kubenswrapper[5008]: I0318 18:05:28.609772 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgv8s_9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd/kube-multus/1.log" Mar 18 18:05:29 crc kubenswrapper[5008]: I0318 18:05:29.197427 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:29 crc kubenswrapper[5008]: E0318 18:05:29.197667 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:29 crc kubenswrapper[5008]: E0318 18:05:29.330505 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 18:05:30 crc kubenswrapper[5008]: I0318 18:05:30.197957 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:30 crc kubenswrapper[5008]: I0318 18:05:30.198004 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:30 crc kubenswrapper[5008]: I0318 18:05:30.197963 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:30 crc kubenswrapper[5008]: E0318 18:05:30.198159 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:30 crc kubenswrapper[5008]: E0318 18:05:30.198422 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:30 crc kubenswrapper[5008]: E0318 18:05:30.198548 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:31 crc kubenswrapper[5008]: I0318 18:05:31.197471 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:31 crc kubenswrapper[5008]: E0318 18:05:31.197964 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:32 crc kubenswrapper[5008]: I0318 18:05:32.197360 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:32 crc kubenswrapper[5008]: I0318 18:05:32.197411 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:32 crc kubenswrapper[5008]: E0318 18:05:32.197539 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:32 crc kubenswrapper[5008]: I0318 18:05:32.197835 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:32 crc kubenswrapper[5008]: E0318 18:05:32.197940 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:32 crc kubenswrapper[5008]: E0318 18:05:32.198045 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:33 crc kubenswrapper[5008]: I0318 18:05:33.198176 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:33 crc kubenswrapper[5008]: E0318 18:05:33.198520 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:34 crc kubenswrapper[5008]: I0318 18:05:34.197207 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:34 crc kubenswrapper[5008]: I0318 18:05:34.197307 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:34 crc kubenswrapper[5008]: E0318 18:05:34.199434 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:34 crc kubenswrapper[5008]: I0318 18:05:34.199461 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:34 crc kubenswrapper[5008]: E0318 18:05:34.199505 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:34 crc kubenswrapper[5008]: E0318 18:05:34.199673 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:34 crc kubenswrapper[5008]: E0318 18:05:34.331816 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 18:05:35 crc kubenswrapper[5008]: I0318 18:05:35.198093 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:35 crc kubenswrapper[5008]: E0318 18:05:35.198331 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:36 crc kubenswrapper[5008]: I0318 18:05:36.197893 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:36 crc kubenswrapper[5008]: I0318 18:05:36.197928 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:36 crc kubenswrapper[5008]: I0318 18:05:36.197892 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:36 crc kubenswrapper[5008]: E0318 18:05:36.198055 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:36 crc kubenswrapper[5008]: E0318 18:05:36.198148 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:36 crc kubenswrapper[5008]: E0318 18:05:36.198257 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:37 crc kubenswrapper[5008]: I0318 18:05:37.197945 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:37 crc kubenswrapper[5008]: E0318 18:05:37.198386 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:37 crc kubenswrapper[5008]: I0318 18:05:37.199467 5008 scope.go:117] "RemoveContainer" containerID="e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9" Mar 18 18:05:37 crc kubenswrapper[5008]: I0318 18:05:37.644254 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/3.log" Mar 18 18:05:37 crc kubenswrapper[5008]: I0318 18:05:37.651704 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerStarted","Data":"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78"} Mar 18 18:05:37 crc kubenswrapper[5008]: I0318 18:05:37.654172 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:05:37 crc kubenswrapper[5008]: I0318 18:05:37.691639 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podStartSLOduration=142.691620949 podStartE2EDuration="2m22.691620949s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:37.690500518 +0000 UTC m=+194.209973607" watchObservedRunningTime="2026-03-18 
18:05:37.691620949 +0000 UTC m=+194.211094028" Mar 18 18:05:38 crc kubenswrapper[5008]: I0318 18:05:38.110871 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g2z9p"] Mar 18 18:05:38 crc kubenswrapper[5008]: I0318 18:05:38.111001 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:38 crc kubenswrapper[5008]: E0318 18:05:38.111133 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:38 crc kubenswrapper[5008]: I0318 18:05:38.198406 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:38 crc kubenswrapper[5008]: I0318 18:05:38.198446 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:38 crc kubenswrapper[5008]: I0318 18:05:38.198406 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:38 crc kubenswrapper[5008]: E0318 18:05:38.198620 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:38 crc kubenswrapper[5008]: E0318 18:05:38.198765 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:38 crc kubenswrapper[5008]: E0318 18:05:38.198879 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:39 crc kubenswrapper[5008]: I0318 18:05:39.198283 5008 scope.go:117] "RemoveContainer" containerID="49e87bbd2ba41b38445ad4d5a4cac446075b67c32199067b8e9316283fdc1d0b" Mar 18 18:05:39 crc kubenswrapper[5008]: E0318 18:05:39.333171 5008 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 18:05:39 crc kubenswrapper[5008]: I0318 18:05:39.659609 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgv8s_9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd/kube-multus/1.log" Mar 18 18:05:39 crc kubenswrapper[5008]: I0318 18:05:39.659707 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgv8s" event={"ID":"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd","Type":"ContainerStarted","Data":"9ed6a37c676aa71b3c726690ca391ca75c10ba6ed602041cdcebcfdc7e15ea9e"} Mar 18 18:05:40 crc kubenswrapper[5008]: I0318 18:05:40.198115 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:40 crc kubenswrapper[5008]: I0318 18:05:40.198217 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:40 crc kubenswrapper[5008]: I0318 18:05:40.198217 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:40 crc kubenswrapper[5008]: E0318 18:05:40.198350 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:40 crc kubenswrapper[5008]: I0318 18:05:40.198381 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:40 crc kubenswrapper[5008]: E0318 18:05:40.198613 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:40 crc kubenswrapper[5008]: E0318 18:05:40.198797 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:40 crc kubenswrapper[5008]: E0318 18:05:40.198891 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:42 crc kubenswrapper[5008]: I0318 18:05:42.198238 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:42 crc kubenswrapper[5008]: I0318 18:05:42.198361 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:42 crc kubenswrapper[5008]: I0318 18:05:42.198263 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:42 crc kubenswrapper[5008]: I0318 18:05:42.198449 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:42 crc kubenswrapper[5008]: E0318 18:05:42.198595 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:42 crc kubenswrapper[5008]: E0318 18:05:42.198870 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:42 crc kubenswrapper[5008]: E0318 18:05:42.198891 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:42 crc kubenswrapper[5008]: E0318 18:05:42.198963 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:44 crc kubenswrapper[5008]: I0318 18:05:44.110788 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:44 crc kubenswrapper[5008]: I0318 18:05:44.110921 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.111014 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:07:46.110932797 +0000 UTC m=+322.630405916 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:44 crc kubenswrapper[5008]: I0318 18:05:44.111074 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.111126 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:05:44 crc kubenswrapper[5008]: I0318 18:05:44.111136 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:44 crc kubenswrapper[5008]: I0318 18:05:44.111211 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.111149 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.111262 5008 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.111293 5008 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.111318 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:07:46.111302058 +0000 UTC m=+322.630775167 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.111212 5008 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.111395 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 18:07:46.11136869 +0000 UTC m=+322.630841779 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.111456 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 18:07:46.111415061 +0000 UTC m=+322.630888210 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.111543 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.111611 5008 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.111635 5008 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.111709 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 18:07:46.111689489 +0000 UTC m=+322.631162718 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 18:05:44 crc kubenswrapper[5008]: I0318 18:05:44.197240 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.199737 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 18:05:44 crc kubenswrapper[5008]: I0318 18:05:44.199839 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:44 crc kubenswrapper[5008]: I0318 18:05:44.199890 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:44 crc kubenswrapper[5008]: I0318 18:05:44.199907 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.200055 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.200174 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 18:05:44 crc kubenswrapper[5008]: E0318 18:05:44.200293 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g2z9p" podUID="1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7" Mar 18 18:05:46 crc kubenswrapper[5008]: I0318 18:05:46.197597 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:05:46 crc kubenswrapper[5008]: I0318 18:05:46.197638 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:05:46 crc kubenswrapper[5008]: I0318 18:05:46.197737 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:05:46 crc kubenswrapper[5008]: I0318 18:05:46.197908 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:05:46 crc kubenswrapper[5008]: I0318 18:05:46.199961 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 18:05:46 crc kubenswrapper[5008]: I0318 18:05:46.201252 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 18:05:46 crc kubenswrapper[5008]: I0318 18:05:46.201289 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 18:05:46 crc kubenswrapper[5008]: I0318 18:05:46.201478 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 18:05:46 crc kubenswrapper[5008]: I0318 18:05:46.201545 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 18:05:46 crc kubenswrapper[5008]: I0318 18:05:46.201727 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.439697 5008 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.491164 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z"] Mar 18 18:05:48 crc 
kubenswrapper[5008]: I0318 18:05:48.491858 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.492094 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.492933 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.494982 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.496399 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.500728 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.501155 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.501505 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.502377 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.502798 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 18:05:48 crc 
kubenswrapper[5008]: I0318 18:05:48.503135 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.503650 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.503813 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.508010 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bprvc"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.508662 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.513228 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.513387 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.513393 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.513813 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.513841 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.513874 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.513454 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.514098 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.514659 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.514771 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.515038 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.515642 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.517536 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-n6cmd"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.521750 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d7v25"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.524328 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d7v25" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.525430 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-n6cmd" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.541913 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.542413 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.542667 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.542867 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-s5pml"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.543366 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-s5pml" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.543767 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-gmczr"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.544224 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.544712 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.544948 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.545043 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.545276 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.545678 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.547521 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8jq26"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.548206 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qnsxl"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.548758 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.549413 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.549603 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.549979 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.553541 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.554396 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.555022 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.555334 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.555963 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.562035 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cfb606e-0b94-4fef-b4c9-92cd528eab5c-trusted-ca\") pod \"console-operator-58897d9998-n6cmd\" (UID: \"5cfb606e-0b94-4fef-b4c9-92cd528eab5c\") " pod="openshift-console-operator/console-operator-58897d9998-n6cmd" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.562079 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9nh8\" (UniqueName: \"kubernetes.io/projected/fd727bd7-0dd3-44a6-90da-e63c81fb6194-kube-api-access-s9nh8\") pod \"cluster-samples-operator-665b6dd947-c5wlw\" (UID: \"fd727bd7-0dd3-44a6-90da-e63c81fb6194\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.562104 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1d56a0c1-b18d-4e8c-acff-e57f000b3744-node-pullsecrets\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.562124 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5stq\" (UniqueName: \"kubernetes.io/projected/225c4962-d9d2-4d32-85de-51872521d9a3-kube-api-access-s5stq\") pod \"route-controller-manager-6576b87f9c-dp77z\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.562145 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpnfp\" (UniqueName: \"kubernetes.io/projected/1d56a0c1-b18d-4e8c-acff-e57f000b3744-kube-api-access-vpnfp\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.562166 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/225c4962-d9d2-4d32-85de-51872521d9a3-serving-cert\") pod \"route-controller-manager-6576b87f9c-dp77z\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.562185 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57e1d26b-c119-48dc-8f78-35168b785d47-metrics-tls\") pod \"dns-operator-744455d44c-d7v25\" (UID: \"57e1d26b-c119-48dc-8f78-35168b785d47\") " pod="openshift-dns-operator/dns-operator-744455d44c-d7v25" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.562206 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d56a0c1-b18d-4e8c-acff-e57f000b3744-serving-cert\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.562226 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7856ce5-83fa-4265-84e5-73d635bcbd17-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2g6md\" (UID: \"e7856ce5-83fa-4265-84e5-73d635bcbd17\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.562246 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7856ce5-83fa-4265-84e5-73d635bcbd17-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2g6md\" (UID: \"e7856ce5-83fa-4265-84e5-73d635bcbd17\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.556993 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.557188 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-z9ssp"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.557116 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.560425 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.560736 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.560862 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.562670 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5gw26"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.560918 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 18:05:48 crc 
kubenswrapper[5008]: I0318 18:05:48.561000 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.562881 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.562962 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.563022 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.563291 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.563740 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.563785 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.564094 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.564124 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.564280 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.564607 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.564846 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.565198 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-etcd-serving-ca\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.565320 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.565613 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.565887 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.565945 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd727bd7-0dd3-44a6-90da-e63c81fb6194-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-c5wlw\" (UID: \"fd727bd7-0dd3-44a6-90da-e63c81fb6194\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566003 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/225c4962-d9d2-4d32-85de-51872521d9a3-client-ca\") pod \"route-controller-manager-6576b87f9c-dp77z\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566037 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cfb606e-0b94-4fef-b4c9-92cd528eab5c-serving-cert\") pod \"console-operator-58897d9998-n6cmd\" (UID: \"5cfb606e-0b94-4fef-b4c9-92cd528eab5c\") " pod="openshift-console-operator/console-operator-58897d9998-n6cmd"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566083 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjgxx\" (UniqueName: \"kubernetes.io/projected/5cfb606e-0b94-4fef-b4c9-92cd528eab5c-kube-api-access-cjgxx\") pod \"console-operator-58897d9998-n6cmd\" (UID: \"5cfb606e-0b94-4fef-b4c9-92cd528eab5c\") " pod="openshift-console-operator/console-operator-58897d9998-n6cmd"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566123 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cfb606e-0b94-4fef-b4c9-92cd528eab5c-config\") pod \"console-operator-58897d9998-n6cmd\" (UID: \"5cfb606e-0b94-4fef-b4c9-92cd528eab5c\") " pod="openshift-console-operator/console-operator-58897d9998-n6cmd"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566157 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566159 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d56a0c1-b18d-4e8c-acff-e57f000b3744-encryption-config\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566188 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566246 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-config\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566288 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225c4962-d9d2-4d32-85de-51872521d9a3-config\") pod \"route-controller-manager-6576b87f9c-dp77z\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566309 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d56a0c1-b18d-4e8c-acff-e57f000b3744-etcd-client\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566326 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t9v5\" (UniqueName: \"kubernetes.io/projected/57e1d26b-c119-48dc-8f78-35168b785d47-kube-api-access-8t9v5\") pod \"dns-operator-744455d44c-d7v25\" (UID: \"57e1d26b-c119-48dc-8f78-35168b785d47\") " pod="openshift-dns-operator/dns-operator-744455d44c-d7v25"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566344 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkjm8\" (UniqueName: \"kubernetes.io/projected/e7856ce5-83fa-4265-84e5-73d635bcbd17-kube-api-access-rkjm8\") pod \"openshift-apiserver-operator-796bbdcf4f-2g6md\" (UID: \"e7856ce5-83fa-4265-84e5-73d635bcbd17\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566368 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d56a0c1-b18d-4e8c-acff-e57f000b3744-audit-dir\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566388 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-audit\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.566426 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-image-import-ca\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.567075 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.571152 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.571436 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.571615 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.571793 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.572252 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.572345 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tvxdw"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.572714 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.573066 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.573260 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tvxdw"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.573618 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.573888 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.574320 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-s5m7q"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.574759 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.577585 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.577786 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.577938 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.578444 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.578832 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.579035 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vnxkn"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.579145 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.580271 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.580771 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.580985 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.595119 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.597043 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.598952 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.601426 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.601479 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.601662 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.602282 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.605192 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.613497 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.613837 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.614545 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.614818 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.614990 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.615331 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.615585 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.615770 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.616002 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.616225 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.616473 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.616676 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.616783 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.616892 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.616995 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.617087 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.617196 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.617445 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.617508 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.617546 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.617740 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.617820 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.618134 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.619853 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.620544 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.620772 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.621206 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.622156 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.622555 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.622658 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.622806 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.623026 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.623162 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.623280 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.625541 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.626240 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k965r"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.626713 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.627135 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.627256 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.627492 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.627603 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-k965r"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.627967 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.628679 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.629420 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.629808 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.630183 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.630240 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.632611 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.635400 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.636319 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.636450 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.638992 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.639119 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.639719 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4nz7f"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.640370 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.642547 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.643452 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.643807 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-crqfs"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.644385 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.644798 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-crqfs"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.646539 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.647067 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.649637 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564284-6tsrt"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.650172 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.650579 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564284-6tsrt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.650784 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.651259 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.651379 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tfg8c"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.652182 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.652939 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.653734 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.654243 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.656005 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.658678 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.663711 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d7v25"]
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667124 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667173 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-oauth-config\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667213 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-config\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667237 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225c4962-d9d2-4d32-85de-51872521d9a3-config\") pod \"route-controller-manager-6576b87f9c-dp77z\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667261 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-serving-cert\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667281 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d56a0c1-b18d-4e8c-acff-e57f000b3744-etcd-client\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667301 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t9v5\" (UniqueName: \"kubernetes.io/projected/57e1d26b-c119-48dc-8f78-35168b785d47-kube-api-access-8t9v5\") pod \"dns-operator-744455d44c-d7v25\" (UID: \"57e1d26b-c119-48dc-8f78-35168b785d47\") " pod="openshift-dns-operator/dns-operator-744455d44c-d7v25"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667321 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3639ff6-daf7-495e-a7c7-b687a2bb9262-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wq5x6\" (UID: \"c3639ff6-daf7-495e-a7c7-b687a2bb9262\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667342 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0c14eea6-708c-4f37-a1e1-67ae91804b9d-srv-cert\") pod \"catalog-operator-68c6474976-tpw24\" (UID: \"0c14eea6-708c-4f37-a1e1-67ae91804b9d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667362 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebf037f0-6468-45ec-a599-49652456a53f-serving-cert\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667387 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkjm8\" (UniqueName: \"kubernetes.io/projected/e7856ce5-83fa-4265-84e5-73d635bcbd17-kube-api-access-rkjm8\") pod \"openshift-apiserver-operator-796bbdcf4f-2g6md\" (UID: \"e7856ce5-83fa-4265-84e5-73d635bcbd17\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667409 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d56a0c1-b18d-4e8c-acff-e57f000b3744-audit-dir\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667431 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/66c96a01-fca4-48c5-bb58-c5e954cdf1a2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9zlx9\" (UID: \"66c96a01-fca4-48c5-bb58-c5e954cdf1a2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667451 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ebf037f0-6468-45ec-a599-49652456a53f-etcd-ca\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667470 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-serving-cert\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667490 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-audit\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667513 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-image-import-ca\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667571 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-service-ca-bundle\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667596 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-config\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667614 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-service-ca\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667637 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3639ff6-daf7-495e-a7c7-b687a2bb9262-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wq5x6\" (UID: \"c3639ff6-daf7-495e-a7c7-b687a2bb9262\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667658 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2d627459-6ea9-460c-8f9e-1fe47bcc59e1-images\") pod \"machine-config-operator-74547568cd-fbqw7\" (UID: \"2d627459-6ea9-460c-8f9e-1fe47bcc59e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667676 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf037f0-6468-45ec-a599-49652456a53f-config\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667698 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qd4\" (UniqueName: \"kubernetes.io/projected/66c96a01-fca4-48c5-bb58-c5e954cdf1a2-kube-api-access-46qd4\") pod \"olm-operator-6b444d44fb-9zlx9\" (UID: \"66c96a01-fca4-48c5-bb58-c5e954cdf1a2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9"
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667723 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg44b\" (UniqueName: \"kubernetes.io/projected/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-kube-api-access-gg44b\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") "
pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667747 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2d627459-6ea9-460c-8f9e-1fe47bcc59e1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fbqw7\" (UID: \"2d627459-6ea9-460c-8f9e-1fe47bcc59e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667778 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cfb606e-0b94-4fef-b4c9-92cd528eab5c-trusted-ca\") pod \"console-operator-58897d9998-n6cmd\" (UID: \"5cfb606e-0b94-4fef-b4c9-92cd528eab5c\") " pod="openshift-console-operator/console-operator-58897d9998-n6cmd" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667797 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-trusted-ca-bundle\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667822 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nh8\" (UniqueName: \"kubernetes.io/projected/fd727bd7-0dd3-44a6-90da-e63c81fb6194-kube-api-access-s9nh8\") pod \"cluster-samples-operator-665b6dd947-c5wlw\" (UID: \"fd727bd7-0dd3-44a6-90da-e63c81fb6194\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667844 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qj44k\" (UniqueName: \"kubernetes.io/projected/0c14eea6-708c-4f37-a1e1-67ae91804b9d-kube-api-access-qj44k\") pod \"catalog-operator-68c6474976-tpw24\" (UID: \"0c14eea6-708c-4f37-a1e1-67ae91804b9d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667871 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vmbl\" (UniqueName: \"kubernetes.io/projected/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-kube-api-access-7vmbl\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667894 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1d56a0c1-b18d-4e8c-acff-e57f000b3744-node-pullsecrets\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667912 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5stq\" (UniqueName: \"kubernetes.io/projected/225c4962-d9d2-4d32-85de-51872521d9a3-kube-api-access-s5stq\") pod \"route-controller-manager-6576b87f9c-dp77z\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667927 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpnfp\" (UniqueName: \"kubernetes.io/projected/1d56a0c1-b18d-4e8c-acff-e57f000b3744-kube-api-access-vpnfp\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc 
kubenswrapper[5008]: I0318 18:05:48.667943 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ebf037f0-6468-45ec-a599-49652456a53f-etcd-client\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667959 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/225c4962-d9d2-4d32-85de-51872521d9a3-serving-cert\") pod \"route-controller-manager-6576b87f9c-dp77z\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667974 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qczbs\" (UniqueName: \"kubernetes.io/projected/c3639ff6-daf7-495e-a7c7-b687a2bb9262-kube-api-access-qczbs\") pod \"cluster-image-registry-operator-dc59b4c8b-wq5x6\" (UID: \"c3639ff6-daf7-495e-a7c7-b687a2bb9262\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.667989 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ebf037f0-6468-45ec-a599-49652456a53f-etcd-service-ca\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668006 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57e1d26b-c119-48dc-8f78-35168b785d47-metrics-tls\") pod 
\"dns-operator-744455d44c-d7v25\" (UID: \"57e1d26b-c119-48dc-8f78-35168b785d47\") " pod="openshift-dns-operator/dns-operator-744455d44c-d7v25" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668020 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvl6j\" (UniqueName: \"kubernetes.io/projected/2d627459-6ea9-460c-8f9e-1fe47bcc59e1-kube-api-access-zvl6j\") pod \"machine-config-operator-74547568cd-fbqw7\" (UID: \"2d627459-6ea9-460c-8f9e-1fe47bcc59e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668038 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668047 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-config\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668072 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d56a0c1-b18d-4e8c-acff-e57f000b3744-serving-cert\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668113 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e7856ce5-83fa-4265-84e5-73d635bcbd17-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2g6md\" (UID: \"e7856ce5-83fa-4265-84e5-73d635bcbd17\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668144 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7856ce5-83fa-4265-84e5-73d635bcbd17-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2g6md\" (UID: \"e7856ce5-83fa-4265-84e5-73d635bcbd17\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668175 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxvxc\" (UniqueName: \"kubernetes.io/projected/31a94b93-89d6-4fab-87d7-05ecd80f55ec-kube-api-access-jxvxc\") pod \"downloads-7954f5f757-s5pml\" (UID: \"31a94b93-89d6-4fab-87d7-05ecd80f55ec\") " pod="openshift-console/downloads-7954f5f757-s5pml" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668208 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-etcd-serving-ca\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668250 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd727bd7-0dd3-44a6-90da-e63c81fb6194-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-c5wlw\" (UID: \"fd727bd7-0dd3-44a6-90da-e63c81fb6194\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw" Mar 18 18:05:48 crc 
kubenswrapper[5008]: I0318 18:05:48.668272 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/66c96a01-fca4-48c5-bb58-c5e954cdf1a2-srv-cert\") pod \"olm-operator-6b444d44fb-9zlx9\" (UID: \"66c96a01-fca4-48c5-bb58-c5e954cdf1a2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668293 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3639ff6-daf7-495e-a7c7-b687a2bb9262-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wq5x6\" (UID: \"c3639ff6-daf7-495e-a7c7-b687a2bb9262\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668283 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668888 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7856ce5-83fa-4265-84e5-73d635bcbd17-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2g6md\" (UID: \"e7856ce5-83fa-4265-84e5-73d635bcbd17\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668312 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4j5w\" (UniqueName: \"kubernetes.io/projected/ebf037f0-6468-45ec-a599-49652456a53f-kube-api-access-k4j5w\") pod 
\"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668923 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d56a0c1-b18d-4e8c-acff-e57f000b3744-audit-dir\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.668964 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/225c4962-d9d2-4d32-85de-51872521d9a3-client-ca\") pod \"route-controller-manager-6576b87f9c-dp77z\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.669001 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0c14eea6-708c-4f37-a1e1-67ae91804b9d-profile-collector-cert\") pod \"catalog-operator-68c6474976-tpw24\" (UID: \"0c14eea6-708c-4f37-a1e1-67ae91804b9d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.669028 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cfb606e-0b94-4fef-b4c9-92cd528eab5c-serving-cert\") pod \"console-operator-58897d9998-n6cmd\" (UID: \"5cfb606e-0b94-4fef-b4c9-92cd528eab5c\") " pod="openshift-console-operator/console-operator-58897d9998-n6cmd" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.669058 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjgxx\" 
(UniqueName: \"kubernetes.io/projected/5cfb606e-0b94-4fef-b4c9-92cd528eab5c-kube-api-access-cjgxx\") pod \"console-operator-58897d9998-n6cmd\" (UID: \"5cfb606e-0b94-4fef-b4c9-92cd528eab5c\") " pod="openshift-console-operator/console-operator-58897d9998-n6cmd" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.669080 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cfb606e-0b94-4fef-b4c9-92cd528eab5c-config\") pod \"console-operator-58897d9998-n6cmd\" (UID: \"5cfb606e-0b94-4fef-b4c9-92cd528eab5c\") " pod="openshift-console-operator/console-operator-58897d9998-n6cmd" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.669098 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d627459-6ea9-460c-8f9e-1fe47bcc59e1-proxy-tls\") pod \"machine-config-operator-74547568cd-fbqw7\" (UID: \"2d627459-6ea9-460c-8f9e-1fe47bcc59e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.669113 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-config\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.669129 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-oauth-serving-cert\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.669147 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d56a0c1-b18d-4e8c-acff-e57f000b3744-encryption-config\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.669343 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-audit\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.669789 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225c4962-d9d2-4d32-85de-51872521d9a3-config\") pod \"route-controller-manager-6576b87f9c-dp77z\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.669946 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bprvc"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.670105 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-image-import-ca\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.670790 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/225c4962-d9d2-4d32-85de-51872521d9a3-client-ca\") pod \"route-controller-manager-6576b87f9c-dp77z\" (UID: 
\"225c4962-d9d2-4d32-85de-51872521d9a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.671782 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cfb606e-0b94-4fef-b4c9-92cd528eab5c-trusted-ca\") pod \"console-operator-58897d9998-n6cmd\" (UID: \"5cfb606e-0b94-4fef-b4c9-92cd528eab5c\") " pod="openshift-console-operator/console-operator-58897d9998-n6cmd" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.672011 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1d56a0c1-b18d-4e8c-acff-e57f000b3744-node-pullsecrets\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.673095 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d56a0c1-b18d-4e8c-acff-e57f000b3744-encryption-config\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.673379 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cfb606e-0b94-4fef-b4c9-92cd528eab5c-config\") pod \"console-operator-58897d9998-n6cmd\" (UID: \"5cfb606e-0b94-4fef-b4c9-92cd528eab5c\") " pod="openshift-console-operator/console-operator-58897d9998-n6cmd" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.673724 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-n6cmd"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.673780 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d56a0c1-b18d-4e8c-acff-e57f000b3744-serving-cert\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.674260 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d56a0c1-b18d-4e8c-acff-e57f000b3744-etcd-client\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.674706 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/225c4962-d9d2-4d32-85de-51872521d9a3-serving-cert\") pod \"route-controller-manager-6576b87f9c-dp77z\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.674926 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7856ce5-83fa-4265-84e5-73d635bcbd17-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2g6md\" (UID: \"e7856ce5-83fa-4265-84e5-73d635bcbd17\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.675119 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d56a0c1-b18d-4e8c-acff-e57f000b3744-etcd-serving-ca\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.675351 5008 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.675882 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-s5pml"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.676055 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd727bd7-0dd3-44a6-90da-e63c81fb6194-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-c5wlw\" (UID: \"fd727bd7-0dd3-44a6-90da-e63c81fb6194\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.677810 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57e1d26b-c119-48dc-8f78-35168b785d47-metrics-tls\") pod \"dns-operator-744455d44c-d7v25\" (UID: \"57e1d26b-c119-48dc-8f78-35168b785d47\") " pod="openshift-dns-operator/dns-operator-744455d44c-d7v25" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.681615 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-gmczr"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.686322 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5bfwn"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.687161 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5bfwn" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.689186 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8jq26"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.691758 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.692836 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cfb606e-0b94-4fef-b4c9-92cd528eab5c-serving-cert\") pod \"console-operator-58897d9998-n6cmd\" (UID: \"5cfb606e-0b94-4fef-b4c9-92cd528eab5c\") " pod="openshift-console-operator/console-operator-58897d9998-n6cmd" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.693996 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.694042 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.695360 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-s5m7q"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.696448 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.697467 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.701573 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.702606 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4nz7f"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.703590 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qnsxl"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.707651 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.707713 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.709650 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.712303 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5gw26"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.712754 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-z9ssp"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.714674 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.715369 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.715847 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9"] 
Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.717838 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k965r"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.717879 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564284-6tsrt"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.719139 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vnxkn"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.720683 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-crqfs"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.721670 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.723238 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.723264 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.724500 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.725704 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.726274 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dx69r"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.727178 
5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dx69r" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.727509 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.728969 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.729940 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.730851 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5bfwn"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.731901 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.733130 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tfg8c"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.733710 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.733925 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2w6x4"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.735393 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ltp2s"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.735958 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ltp2s"] Mar 18 18:05:48 crc 
kubenswrapper[5008]: I0318 18:05:48.736033 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ltp2s" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.736245 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.736942 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2w6x4"] Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.762671 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.770729 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-service-ca-bundle\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.770759 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-config\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.770776 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-service-ca\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.770795 
5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3639ff6-daf7-495e-a7c7-b687a2bb9262-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wq5x6\" (UID: \"c3639ff6-daf7-495e-a7c7-b687a2bb9262\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.770813 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2d627459-6ea9-460c-8f9e-1fe47bcc59e1-images\") pod \"machine-config-operator-74547568cd-fbqw7\" (UID: \"2d627459-6ea9-460c-8f9e-1fe47bcc59e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.770829 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf037f0-6468-45ec-a599-49652456a53f-config\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.770846 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg44b\" (UniqueName: \"kubernetes.io/projected/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-kube-api-access-gg44b\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.770945 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46qd4\" (UniqueName: \"kubernetes.io/projected/66c96a01-fca4-48c5-bb58-c5e954cdf1a2-kube-api-access-46qd4\") pod \"olm-operator-6b444d44fb-9zlx9\" (UID: 
\"66c96a01-fca4-48c5-bb58-c5e954cdf1a2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.770967 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2d627459-6ea9-460c-8f9e-1fe47bcc59e1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fbqw7\" (UID: \"2d627459-6ea9-460c-8f9e-1fe47bcc59e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.771743 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-trusted-ca-bundle\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.771490 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf037f0-6468-45ec-a599-49652456a53f-config\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.771701 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2d627459-6ea9-460c-8f9e-1fe47bcc59e1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fbqw7\" (UID: \"2d627459-6ea9-460c-8f9e-1fe47bcc59e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.772360 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-service-ca\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.772668 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-trusted-ca-bundle\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.772888 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj44k\" (UniqueName: \"kubernetes.io/projected/0c14eea6-708c-4f37-a1e1-67ae91804b9d-kube-api-access-qj44k\") pod \"catalog-operator-68c6474976-tpw24\" (UID: \"0c14eea6-708c-4f37-a1e1-67ae91804b9d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.772957 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vmbl\" (UniqueName: \"kubernetes.io/projected/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-kube-api-access-7vmbl\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773105 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ebf037f0-6468-45ec-a599-49652456a53f-etcd-client\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773139 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qczbs\" 
(UniqueName: \"kubernetes.io/projected/c3639ff6-daf7-495e-a7c7-b687a2bb9262-kube-api-access-qczbs\") pod \"cluster-image-registry-operator-dc59b4c8b-wq5x6\" (UID: \"c3639ff6-daf7-495e-a7c7-b687a2bb9262\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773580 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ebf037f0-6468-45ec-a599-49652456a53f-etcd-service-ca\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773521 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773610 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvl6j\" (UniqueName: \"kubernetes.io/projected/2d627459-6ea9-460c-8f9e-1fe47bcc59e1-kube-api-access-zvl6j\") pod \"machine-config-operator-74547568cd-fbqw7\" (UID: \"2d627459-6ea9-460c-8f9e-1fe47bcc59e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773632 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773651 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxvxc\" (UniqueName: 
\"kubernetes.io/projected/31a94b93-89d6-4fab-87d7-05ecd80f55ec-kube-api-access-jxvxc\") pod \"downloads-7954f5f757-s5pml\" (UID: \"31a94b93-89d6-4fab-87d7-05ecd80f55ec\") " pod="openshift-console/downloads-7954f5f757-s5pml" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773668 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/66c96a01-fca4-48c5-bb58-c5e954cdf1a2-srv-cert\") pod \"olm-operator-6b444d44fb-9zlx9\" (UID: \"66c96a01-fca4-48c5-bb58-c5e954cdf1a2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773684 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3639ff6-daf7-495e-a7c7-b687a2bb9262-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wq5x6\" (UID: \"c3639ff6-daf7-495e-a7c7-b687a2bb9262\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773739 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4j5w\" (UniqueName: \"kubernetes.io/projected/ebf037f0-6468-45ec-a599-49652456a53f-kube-api-access-k4j5w\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773756 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0c14eea6-708c-4f37-a1e1-67ae91804b9d-profile-collector-cert\") pod \"catalog-operator-68c6474976-tpw24\" (UID: \"0c14eea6-708c-4f37-a1e1-67ae91804b9d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773781 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d627459-6ea9-460c-8f9e-1fe47bcc59e1-proxy-tls\") pod \"machine-config-operator-74547568cd-fbqw7\" (UID: \"2d627459-6ea9-460c-8f9e-1fe47bcc59e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773796 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-config\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773810 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-oauth-serving-cert\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773835 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-oauth-config\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773858 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-serving-cert\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" Mar 18 18:05:48 crc 
kubenswrapper[5008]: I0318 18:05:48.773881 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3639ff6-daf7-495e-a7c7-b687a2bb9262-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wq5x6\" (UID: \"c3639ff6-daf7-495e-a7c7-b687a2bb9262\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773897 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0c14eea6-708c-4f37-a1e1-67ae91804b9d-srv-cert\") pod \"catalog-operator-68c6474976-tpw24\" (UID: \"0c14eea6-708c-4f37-a1e1-67ae91804b9d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773911 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebf037f0-6468-45ec-a599-49652456a53f-serving-cert\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773933 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/66c96a01-fca4-48c5-bb58-c5e954cdf1a2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9zlx9\" (UID: \"66c96a01-fca4-48c5-bb58-c5e954cdf1a2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773947 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-serving-cert\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") 
" pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.773961 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ebf037f0-6468-45ec-a599-49652456a53f-etcd-ca\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.774244 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ebf037f0-6468-45ec-a599-49652456a53f-etcd-service-ca\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.774414 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ebf037f0-6468-45ec-a599-49652456a53f-etcd-ca\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.778418 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-config\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.778608 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-oauth-serving-cert\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 
18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.778933 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-oauth-config\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.779373 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebf037f0-6468-45ec-a599-49652456a53f-serving-cert\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.779386 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ebf037f0-6468-45ec-a599-49652456a53f-etcd-client\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.779664 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3639ff6-daf7-495e-a7c7-b687a2bb9262-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wq5x6\" (UID: \"c3639ff6-daf7-495e-a7c7-b687a2bb9262\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.784225 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-serving-cert\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:48 crc 
kubenswrapper[5008]: I0318 18:05:48.794182 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.814183 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.834254 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.854735 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.873738 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.894275 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.914363 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.934449 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.953945 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.974530 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 18:05:48 crc kubenswrapper[5008]: I0318 18:05:48.994499 5008 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.014208 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.034471 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.055392 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.082216 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.093912 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.115331 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.134640 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.146630 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3639ff6-daf7-495e-a7c7-b687a2bb9262-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wq5x6\" (UID: \"c3639ff6-daf7-495e-a7c7-b687a2bb9262\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.154444 5008 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.175977 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.195530 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.214382 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.235810 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.254862 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.275462 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.293972 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.299114 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-serving-cert\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" Mar 18 18:05:49 crc kubenswrapper[5008]: 
I0318 18:05:49.314694 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.322917 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-config\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.343660 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.346363 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.353886 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.362204 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-service-ca-bundle\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.373645 5008 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.394515 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.414966 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.428816 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/66c96a01-fca4-48c5-bb58-c5e954cdf1a2-srv-cert\") pod \"olm-operator-6b444d44fb-9zlx9\" (UID: \"66c96a01-fca4-48c5-bb58-c5e954cdf1a2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.434371 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.454407 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.473764 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.478118 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/66c96a01-fca4-48c5-bb58-c5e954cdf1a2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9zlx9\" (UID: \"66c96a01-fca4-48c5-bb58-c5e954cdf1a2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.478602 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0c14eea6-708c-4f37-a1e1-67ae91804b9d-profile-collector-cert\") pod \"catalog-operator-68c6474976-tpw24\" (UID: \"0c14eea6-708c-4f37-a1e1-67ae91804b9d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.515448 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.522440 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2d627459-6ea9-460c-8f9e-1fe47bcc59e1-images\") pod \"machine-config-operator-74547568cd-fbqw7\" (UID: \"2d627459-6ea9-460c-8f9e-1fe47bcc59e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.535344 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.555652 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.568975 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d627459-6ea9-460c-8f9e-1fe47bcc59e1-proxy-tls\") pod \"machine-config-operator-74547568cd-fbqw7\" (UID: \"2d627459-6ea9-460c-8f9e-1fe47bcc59e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.574001 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.593462 5008 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.615943 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.641997 5008 request.go:700] Waited for 1.014333406s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.643648 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.654028 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.675280 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.694898 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.714221 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.734297 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.754356 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 18:05:49 crc 
kubenswrapper[5008]: I0318 18:05:49.773710 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 18:05:49 crc kubenswrapper[5008]: E0318 18:05:49.774483 5008 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 18 18:05:49 crc kubenswrapper[5008]: E0318 18:05:49.774601 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c14eea6-708c-4f37-a1e1-67ae91804b9d-srv-cert podName:0c14eea6-708c-4f37-a1e1-67ae91804b9d nodeName:}" failed. No retries permitted until 2026-03-18 18:05:50.274575079 +0000 UTC m=+206.794048158 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/0c14eea6-708c-4f37-a1e1-67ae91804b9d-srv-cert") pod "catalog-operator-68c6474976-tpw24" (UID: "0c14eea6-708c-4f37-a1e1-67ae91804b9d") : failed to sync secret cache: timed out waiting for the condition Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.794119 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.815500 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.834148 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.854318 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 
18:05:49.874211 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.894786 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.914537 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.933787 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.954650 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 18:05:49 crc kubenswrapper[5008]: I0318 18:05:49.994915 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.015018 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.035404 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.054450 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.074537 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.095633 5008 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.114630 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.134851 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.155185 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.174945 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.193726 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.215096 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.233641 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.255029 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.274142 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.291514 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/0c14eea6-708c-4f37-a1e1-67ae91804b9d-srv-cert\") pod \"catalog-operator-68c6474976-tpw24\" (UID: \"0c14eea6-708c-4f37-a1e1-67ae91804b9d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.296286 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0c14eea6-708c-4f37-a1e1-67ae91804b9d-srv-cert\") pod \"catalog-operator-68c6474976-tpw24\" (UID: \"0c14eea6-708c-4f37-a1e1-67ae91804b9d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.301322 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.314137 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.333637 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.370515 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t9v5\" (UniqueName: \"kubernetes.io/projected/57e1d26b-c119-48dc-8f78-35168b785d47-kube-api-access-8t9v5\") pod \"dns-operator-744455d44c-d7v25\" (UID: \"57e1d26b-c119-48dc-8f78-35168b785d47\") " pod="openshift-dns-operator/dns-operator-744455d44c-d7v25" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.402019 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d7v25" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.404678 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkjm8\" (UniqueName: \"kubernetes.io/projected/e7856ce5-83fa-4265-84e5-73d635bcbd17-kube-api-access-rkjm8\") pod \"openshift-apiserver-operator-796bbdcf4f-2g6md\" (UID: \"e7856ce5-83fa-4265-84e5-73d635bcbd17\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.416339 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5stq\" (UniqueName: \"kubernetes.io/projected/225c4962-d9d2-4d32-85de-51872521d9a3-kube-api-access-s5stq\") pod \"route-controller-manager-6576b87f9c-dp77z\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.429373 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjgxx\" (UniqueName: \"kubernetes.io/projected/5cfb606e-0b94-4fef-b4c9-92cd528eab5c-kube-api-access-cjgxx\") pod \"console-operator-58897d9998-n6cmd\" (UID: \"5cfb606e-0b94-4fef-b4c9-92cd528eab5c\") " pod="openshift-console-operator/console-operator-58897d9998-n6cmd" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.437943 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-n6cmd" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.452983 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpnfp\" (UniqueName: \"kubernetes.io/projected/1d56a0c1-b18d-4e8c-acff-e57f000b3744-kube-api-access-vpnfp\") pod \"apiserver-76f77b778f-bprvc\" (UID: \"1d56a0c1-b18d-4e8c-acff-e57f000b3744\") " pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.469179 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9nh8\" (UniqueName: \"kubernetes.io/projected/fd727bd7-0dd3-44a6-90da-e63c81fb6194-kube-api-access-s9nh8\") pod \"cluster-samples-operator-665b6dd947-c5wlw\" (UID: \"fd727bd7-0dd3-44a6-90da-e63c81fb6194\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.474598 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.495869 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.515169 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.534715 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.554854 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.576007 5008 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.595940 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.614396 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.628369 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.634533 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.649849 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.652710 5008 request.go:700] Waited for 1.91629506s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0 Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.654838 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.662610 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.677019 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.679372 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.694838 5008 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.697469 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d7v25"] Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.714403 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.763170 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-n6cmd"] Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.779738 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg44b\" (UniqueName: \"kubernetes.io/projected/102c9d92-61ef-4147-aaf2-7a9e6a1fbfae-kube-api-access-gg44b\") pod \"authentication-operator-69f744f599-vnxkn\" (UID: \"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.779815 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qd4\" (UniqueName: \"kubernetes.io/projected/66c96a01-fca4-48c5-bb58-c5e954cdf1a2-kube-api-access-46qd4\") pod \"olm-operator-6b444d44fb-9zlx9\" (UID: \"66c96a01-fca4-48c5-bb58-c5e954cdf1a2\") 
" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9" Mar 18 18:05:50 crc kubenswrapper[5008]: W0318 18:05:50.793732 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57e1d26b_c119_48dc_8f78_35168b785d47.slice/crio-2579deae379dcf0fde093f1f2032ac039e26296ef9a7272b94213e98b43769ef WatchSource:0}: Error finding container 2579deae379dcf0fde093f1f2032ac039e26296ef9a7272b94213e98b43769ef: Status 404 returned error can't find the container with id 2579deae379dcf0fde093f1f2032ac039e26296ef9a7272b94213e98b43769ef Mar 18 18:05:50 crc kubenswrapper[5008]: W0318 18:05:50.802303 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cfb606e_0b94_4fef_b4c9_92cd528eab5c.slice/crio-d89fdf1d459ea6bb17d574dda0592ff4b9732baae22ad7f51ad244028d509514 WatchSource:0}: Error finding container d89fdf1d459ea6bb17d574dda0592ff4b9732baae22ad7f51ad244028d509514: Status 404 returned error can't find the container with id d89fdf1d459ea6bb17d574dda0592ff4b9732baae22ad7f51ad244028d509514 Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.808713 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj44k\" (UniqueName: \"kubernetes.io/projected/0c14eea6-708c-4f37-a1e1-67ae91804b9d-kube-api-access-qj44k\") pod \"catalog-operator-68c6474976-tpw24\" (UID: \"0c14eea6-708c-4f37-a1e1-67ae91804b9d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.809595 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vmbl\" (UniqueName: \"kubernetes.io/projected/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-kube-api-access-7vmbl\") pod \"console-f9d7485db-gmczr\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:50 crc 
kubenswrapper[5008]: I0318 18:05:50.847547 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxvxc\" (UniqueName: \"kubernetes.io/projected/31a94b93-89d6-4fab-87d7-05ecd80f55ec-kube-api-access-jxvxc\") pod \"downloads-7954f5f757-s5pml\" (UID: \"31a94b93-89d6-4fab-87d7-05ecd80f55ec\") " pod="openshift-console/downloads-7954f5f757-s5pml" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.859586 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z"] Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.861916 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qczbs\" (UniqueName: \"kubernetes.io/projected/c3639ff6-daf7-495e-a7c7-b687a2bb9262-kube-api-access-qczbs\") pod \"cluster-image-registry-operator-dc59b4c8b-wq5x6\" (UID: \"c3639ff6-daf7-495e-a7c7-b687a2bb9262\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.870470 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvl6j\" (UniqueName: \"kubernetes.io/projected/2d627459-6ea9-460c-8f9e-1fe47bcc59e1-kube-api-access-zvl6j\") pod \"machine-config-operator-74547568cd-fbqw7\" (UID: \"2d627459-6ea9-460c-8f9e-1fe47bcc59e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.880906 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw"] Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.887706 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3639ff6-daf7-495e-a7c7-b687a2bb9262-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wq5x6\" (UID: 
\"c3639ff6-daf7-495e-a7c7-b687a2bb9262\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.915241 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4j5w\" (UniqueName: \"kubernetes.io/projected/ebf037f0-6468-45ec-a599-49652456a53f-kube-api-access-k4j5w\") pod \"etcd-operator-b45778765-8jq26\" (UID: \"ebf037f0-6468-45ec-a599-49652456a53f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.924297 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md"] Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.947629 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bprvc"] Mar 18 18:05:50 crc kubenswrapper[5008]: W0318 18:05:50.949545 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod225c4962_d9d2_4d32_85de_51872521d9a3.slice/crio-d14784e67b968db0179cb4f6a40a3f0b6a70dc0140ac8993cbd6f427176d925c WatchSource:0}: Error finding container d14784e67b968db0179cb4f6a40a3f0b6a70dc0140ac8993cbd6f427176d925c: Status 404 returned error can't find the container with id d14784e67b968db0179cb4f6a40a3f0b6a70dc0140ac8993cbd6f427176d925c Mar 18 18:05:50 crc kubenswrapper[5008]: W0318 18:05:50.952297 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7856ce5_83fa_4265_84e5_73d635bcbd17.slice/crio-6eae81784bfc8b1fbbbbf9783d0b71784a013c5ae69d8a91b168f69eadae0daa WatchSource:0}: Error finding container 6eae81784bfc8b1fbbbbf9783d0b71784a013c5ae69d8a91b168f69eadae0daa: Status 404 returned error can't find the container with id 6eae81784bfc8b1fbbbbf9783d0b71784a013c5ae69d8a91b168f69eadae0daa Mar 18 18:05:50 
crc kubenswrapper[5008]: W0318 18:05:50.955718 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d56a0c1_b18d_4e8c_acff_e57f000b3744.slice/crio-e9ba9976daa8ae0a2bec885f32c417fc4d9951b4fb7adae1917001509d78427d WatchSource:0}: Error finding container e9ba9976daa8ae0a2bec885f32c417fc4d9951b4fb7adae1917001509d78427d: Status 404 returned error can't find the container with id e9ba9976daa8ae0a2bec885f32c417fc4d9951b4fb7adae1917001509d78427d Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.961263 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.974653 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.981733 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9" Mar 18 18:05:50 crc kubenswrapper[5008]: I0318 18:05:50.988948 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.000533 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxlb4\" (UniqueName: \"kubernetes.io/projected/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-kube-api-access-pxlb4\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.000589 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h589z\" (UniqueName: \"kubernetes.io/projected/119099cb-bd81-40db-8393-0ada3cbc7619-kube-api-access-h589z\") pod \"multus-admission-controller-857f4d67dd-k965r\" (UID: \"119099cb-bd81-40db-8393-0ada3cbc7619\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k965r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.000620 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.000646 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7b10cc94-4769-4c11-a94d-7b01d8f228b1-stats-auth\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.000703 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm9gh\" (UniqueName: \"kubernetes.io/projected/d5bc82db-9313-414e-aa86-ff630456fb49-kube-api-access-wm9gh\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.000778 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.000797 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgsgw\" (UniqueName: \"kubernetes.io/projected/2c293129-16e9-4469-8307-f11ea13cd329-kube-api-access-lgsgw\") pod \"ingress-operator-5b745b69d9-hmccb\" (UID: \"2c293129-16e9-4469-8307-f11ea13cd329\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.001196 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-serving-cert\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.001230 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-audit-dir\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: 
\"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.001274 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c28c9a0-de21-4c01-a5d6-1f6490878a0b-config\") pod \"kube-apiserver-operator-766d6c64bb-sg7qz\" (UID: \"4c28c9a0-de21-4c01-a5d6-1f6490878a0b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.001303 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-audit-policies\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.001377 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-bound-sa-token\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.002742 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb1f930b-6743-4222-900e-c2442f33be13-serving-cert\") pod \"openshift-config-operator-7777fb866f-l9dpq\" (UID: \"cb1f930b-6743-4222-900e-c2442f33be13\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.002793 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/48236e40-cae1-46af-aa88-f5e038cc1a42-machine-approver-tls\") pod \"machine-approver-56656f9798-f5r2r\" (UID: \"48236e40-cae1-46af-aa88-f5e038cc1a42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.002815 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjr6v\" (UniqueName: \"kubernetes.io/projected/48236e40-cae1-46af-aa88-f5e038cc1a42-kube-api-access-cjr6v\") pod \"machine-approver-56656f9798-f5r2r\" (UID: \"48236e40-cae1-46af-aa88-f5e038cc1a42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.002838 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7ngx\" (UniqueName: \"kubernetes.io/projected/7b10cc94-4769-4c11-a94d-7b01d8f228b1-kube-api-access-m7ngx\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.002879 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-audit-policies\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003023 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003052 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22751667-e260-4216-b78b-cc49c0ff5b5a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bwf7f\" (UID: \"22751667-e260-4216-b78b-cc49c0ff5b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003078 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/137dc523-385f-4afb-b972-66093e2e071e-config\") pod \"machine-api-operator-5694c8668f-qnsxl\" (UID: \"137dc523-385f-4afb-b972-66093e2e071e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003108 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c28c9a0-de21-4c01-a5d6-1f6490878a0b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sg7qz\" (UID: \"4c28c9a0-de21-4c01-a5d6-1f6490878a0b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003134 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppqxf\" (UniqueName: \"kubernetes.io/projected/21073be2-2012-4c88-9797-92b12b7ef7db-kube-api-access-ppqxf\") pod \"service-ca-operator-777779d784-6ksxt\" (UID: \"21073be2-2012-4c88-9797-92b12b7ef7db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003178 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003249 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d02d52ba-4ba4-47b2-b0f3-a769e009d161-trusted-ca\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003350 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-audit-dir\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003392 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/467ea6f3-9b84-4075-a6e2-0adbd72b6ddb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jc24d\" (UID: \"467ea6f3-9b84-4075-a6e2-0adbd72b6ddb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d" Mar 18 18:05:51 crc kubenswrapper[5008]: E0318 18:05:51.003476 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 18:05:51.50346043 +0000 UTC m=+208.022933599 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003576 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c293129-16e9-4469-8307-f11ea13cd329-trusted-ca\") pod \"ingress-operator-5b745b69d9-hmccb\" (UID: \"2c293129-16e9-4469-8307-f11ea13cd329\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003624 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48236e40-cae1-46af-aa88-f5e038cc1a42-config\") pod \"machine-approver-56656f9798-f5r2r\" (UID: \"48236e40-cae1-46af-aa88-f5e038cc1a42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003641 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhz2l\" (UniqueName: \"kubernetes.io/projected/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-kube-api-access-fhz2l\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003678 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22751667-e260-4216-b78b-cc49c0ff5b5a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bwf7f\" (UID: \"22751667-e260-4216-b78b-cc49c0ff5b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003696 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21073be2-2012-4c88-9797-92b12b7ef7db-config\") pod \"service-ca-operator-777779d784-6ksxt\" (UID: \"21073be2-2012-4c88-9797-92b12b7ef7db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003710 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-encryption-config\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003729 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5bc82db-9313-414e-aa86-ff630456fb49-serving-cert\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003766 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdpgn\" (UniqueName: \"kubernetes.io/projected/cb1f930b-6743-4222-900e-c2442f33be13-kube-api-access-zdpgn\") pod 
\"openshift-config-operator-7777fb866f-l9dpq\" (UID: \"cb1f930b-6743-4222-900e-c2442f33be13\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003793 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgmcw\" (UniqueName: \"kubernetes.io/projected/5f5483f1-06dc-4cf0-8fec-2e5f4e16e459-kube-api-access-bgmcw\") pod \"service-ca-9c57cc56f-4nz7f\" (UID: \"5f5483f1-06dc-4cf0-8fec-2e5f4e16e459\") " pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003814 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/467ea6f3-9b84-4075-a6e2-0adbd72b6ddb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jc24d\" (UID: \"467ea6f3-9b84-4075-a6e2-0adbd72b6ddb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003847 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003879 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 
18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003945 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c8f2\" (UniqueName: \"kubernetes.io/projected/860a9876-b8f6-4125-bd1c-51518eb10283-kube-api-access-2c8f2\") pod \"collect-profiles-29564280-d46pl\" (UID: \"860a9876-b8f6-4125-bd1c-51518eb10283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.003974 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh7gk\" (UniqueName: \"kubernetes.io/projected/082fdb2c-88b4-42c3-8eeb-1817c0177198-kube-api-access-qh7gk\") pod \"openshift-controller-manager-operator-756b6f6bc6-hd8hn\" (UID: \"082fdb2c-88b4-42c3-8eeb-1817c0177198\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.004051 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21073be2-2012-4c88-9797-92b12b7ef7db-serving-cert\") pod \"service-ca-operator-777779d784-6ksxt\" (UID: \"21073be2-2012-4c88-9797-92b12b7ef7db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.004116 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-config\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.005712 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/7b10cc94-4769-4c11-a94d-7b01d8f228b1-service-ca-bundle\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.006580 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.006618 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.006642 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/860a9876-b8f6-4125-bd1c-51518eb10283-config-volume\") pod \"collect-profiles-29564280-d46pl\" (UID: \"860a9876-b8f6-4125-bd1c-51518eb10283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.006671 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27hsn\" (UniqueName: \"kubernetes.io/projected/35531061-50ee-4972-bec4-75190551fbbe-kube-api-access-27hsn\") pod \"machine-config-controller-84d6567774-ckkdv\" (UID: 
\"35531061-50ee-4972-bec4-75190551fbbe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.006700 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.006730 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-registry-tls\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.006777 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d02d52ba-4ba4-47b2-b0f3-a769e009d161-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.006803 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c293129-16e9-4469-8307-f11ea13cd329-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hmccb\" (UID: \"2c293129-16e9-4469-8307-f11ea13cd329\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.006831 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c28c9a0-de21-4c01-a5d6-1f6490878a0b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sg7qz\" (UID: \"4c28c9a0-de21-4c01-a5d6-1f6490878a0b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.006850 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-client-ca\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.006873 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7b10cc94-4769-4c11-a94d-7b01d8f228b1-default-certificate\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.006897 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c293129-16e9-4469-8307-f11ea13cd329-metrics-tls\") pod \"ingress-operator-5b745b69d9-hmccb\" (UID: \"2c293129-16e9-4469-8307-f11ea13cd329\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.006995 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: 
\"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007025 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/082fdb2c-88b4-42c3-8eeb-1817c0177198-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-hd8hn\" (UID: \"082fdb2c-88b4-42c3-8eeb-1817c0177198\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007075 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/35531061-50ee-4972-bec4-75190551fbbe-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ckkdv\" (UID: \"35531061-50ee-4972-bec4-75190551fbbe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007095 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgdnf\" (UniqueName: \"kubernetes.io/projected/22751667-e260-4216-b78b-cc49c0ff5b5a-kube-api-access-mgdnf\") pod \"kube-storage-version-migrator-operator-b67b599dd-bwf7f\" (UID: \"22751667-e260-4216-b78b-cc49c0ff5b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007114 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/082fdb2c-88b4-42c3-8eeb-1817c0177198-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-hd8hn\" (UID: \"082fdb2c-88b4-42c3-8eeb-1817c0177198\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007147 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/137dc523-385f-4afb-b972-66093e2e071e-images\") pod \"machine-api-operator-5694c8668f-qnsxl\" (UID: \"137dc523-385f-4afb-b972-66093e2e071e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007167 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d02d52ba-4ba4-47b2-b0f3-a769e009d161-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007185 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg8nb\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-kube-api-access-wg8nb\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007204 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cb1f930b-6743-4222-900e-c2442f33be13-available-featuregates\") pod \"openshift-config-operator-7777fb866f-l9dpq\" (UID: \"cb1f930b-6743-4222-900e-c2442f33be13\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007224 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5f5483f1-06dc-4cf0-8fec-2e5f4e16e459-signing-key\") pod \"service-ca-9c57cc56f-4nz7f\" (UID: \"5f5483f1-06dc-4cf0-8fec-2e5f4e16e459\") " pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007247 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/860a9876-b8f6-4125-bd1c-51518eb10283-secret-volume\") pod \"collect-profiles-29564280-d46pl\" (UID: \"860a9876-b8f6-4125-bd1c-51518eb10283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007270 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35531061-50ee-4972-bec4-75190551fbbe-proxy-tls\") pod \"machine-config-controller-84d6567774-ckkdv\" (UID: \"35531061-50ee-4972-bec4-75190551fbbe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007289 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfjvr\" (UniqueName: \"kubernetes.io/projected/87181d7a-94d9-4918-99d6-0fa95896bc05-kube-api-access-lfjvr\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptkj5\" (UID: \"87181d7a-94d9-4918-99d6-0fa95896bc05\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007514 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9gd4\" (UniqueName: \"kubernetes.io/projected/137dc523-385f-4afb-b972-66093e2e071e-kube-api-access-m9gd4\") pod 
\"machine-api-operator-5694c8668f-qnsxl\" (UID: \"137dc523-385f-4afb-b972-66093e2e071e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007547 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d02d52ba-4ba4-47b2-b0f3-a769e009d161-registry-certificates\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007593 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b10cc94-4769-4c11-a94d-7b01d8f228b1-metrics-certs\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007613 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/87181d7a-94d9-4918-99d6-0fa95896bc05-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptkj5\" (UID: \"87181d7a-94d9-4918-99d6-0fa95896bc05\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007637 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007661 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007701 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-etcd-client\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007738 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007764 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007783 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007805 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/119099cb-bd81-40db-8393-0ada3cbc7619-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k965r\" (UID: \"119099cb-bd81-40db-8393-0ada3cbc7619\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k965r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007831 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5f5483f1-06dc-4cf0-8fec-2e5f4e16e459-signing-cabundle\") pod \"service-ca-9c57cc56f-4nz7f\" (UID: \"5f5483f1-06dc-4cf0-8fec-2e5f4e16e459\") " pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007854 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/48236e40-cae1-46af-aa88-f5e038cc1a42-auth-proxy-config\") pod \"machine-approver-56656f9798-f5r2r\" (UID: \"48236e40-cae1-46af-aa88-f5e038cc1a42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007877 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/137dc523-385f-4afb-b972-66093e2e071e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qnsxl\" (UID: \"137dc523-385f-4afb-b972-66093e2e071e\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.007900 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/467ea6f3-9b84-4075-a6e2-0adbd72b6ddb-config\") pod \"kube-controller-manager-operator-78b949d7b-jc24d\" (UID: \"467ea6f3-9b84-4075-a6e2-0adbd72b6ddb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.021777 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.053548 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-s5pml" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.059128 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.109260 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:51 crc kubenswrapper[5008]: E0318 18:05:51.109442 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:51.609413238 +0000 UTC m=+208.128886317 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.109736 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-config\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.109757 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b10cc94-4769-4c11-a94d-7b01d8f228b1-service-ca-bundle\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.109777 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2204982-5f64-473e-809f-17a61cf942d8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-64lb7\" (UID: \"a2204982-5f64-473e-809f-17a61cf942d8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.109832 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.109849 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.109866 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/860a9876-b8f6-4125-bd1c-51518eb10283-config-volume\") pod \"collect-profiles-29564280-d46pl\" (UID: \"860a9876-b8f6-4125-bd1c-51518eb10283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.109897 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ldt9\" (UniqueName: \"kubernetes.io/projected/9630528a-7c8c-46cd-8bf4-e116f35b6911-kube-api-access-9ldt9\") pod \"packageserver-d55dfcdfc-g9mpk\" (UID: \"9630528a-7c8c-46cd-8bf4-e116f35b6911\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.109922 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27hsn\" (UniqueName: \"kubernetes.io/projected/35531061-50ee-4972-bec4-75190551fbbe-kube-api-access-27hsn\") pod \"machine-config-controller-84d6567774-ckkdv\" (UID: \"35531061-50ee-4972-bec4-75190551fbbe\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.109939 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.109990 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-registry-tls\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110008 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d02d52ba-4ba4-47b2-b0f3-a769e009d161-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110040 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-plugins-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110058 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/9630528a-7c8c-46cd-8bf4-e116f35b6911-apiservice-cert\") pod \"packageserver-d55dfcdfc-g9mpk\" (UID: \"9630528a-7c8c-46cd-8bf4-e116f35b6911\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110092 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c293129-16e9-4469-8307-f11ea13cd329-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hmccb\" (UID: \"2c293129-16e9-4469-8307-f11ea13cd329\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110127 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b4612145-a0ad-4f87-b1e8-9f17248900bd-node-bootstrap-token\") pod \"machine-config-server-dx69r\" (UID: \"b4612145-a0ad-4f87-b1e8-9f17248900bd\") " pod="openshift-machine-config-operator/machine-config-server-dx69r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110146 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c28c9a0-de21-4c01-a5d6-1f6490878a0b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sg7qz\" (UID: \"4c28c9a0-de21-4c01-a5d6-1f6490878a0b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110165 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-client-ca\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 
18:05:51.110198 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2204982-5f64-473e-809f-17a61cf942d8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-64lb7\" (UID: \"a2204982-5f64-473e-809f-17a61cf942d8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110243 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9hp8\" (UniqueName: \"kubernetes.io/projected/ade0f201-c317-40fc-bf82-293a53853ec4-kube-api-access-n9hp8\") pod \"ingress-canary-5bfwn\" (UID: \"ade0f201-c317-40fc-bf82-293a53853ec4\") " pod="openshift-ingress-canary/ingress-canary-5bfwn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110287 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-socket-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110309 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7b10cc94-4769-4c11-a94d-7b01d8f228b1-default-certificate\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110328 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c293129-16e9-4469-8307-f11ea13cd329-metrics-tls\") pod \"ingress-operator-5b745b69d9-hmccb\" (UID: \"2c293129-16e9-4469-8307-f11ea13cd329\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110365 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8qgm\" (UniqueName: \"kubernetes.io/projected/eb6e9847-508d-42d9-b429-2567547e41fe-kube-api-access-c8qgm\") pod \"dns-default-ltp2s\" (UID: \"eb6e9847-508d-42d9-b429-2567547e41fe\") " pod="openshift-dns/dns-default-ltp2s" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110391 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110410 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/082fdb2c-88b4-42c3-8eeb-1817c0177198-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-hd8hn\" (UID: \"082fdb2c-88b4-42c3-8eeb-1817c0177198\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110453 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/35531061-50ee-4972-bec4-75190551fbbe-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ckkdv\" (UID: \"35531061-50ee-4972-bec4-75190551fbbe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110479 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgdnf\" (UniqueName: 
\"kubernetes.io/projected/22751667-e260-4216-b78b-cc49c0ff5b5a-kube-api-access-mgdnf\") pod \"kube-storage-version-migrator-operator-b67b599dd-bwf7f\" (UID: \"22751667-e260-4216-b78b-cc49c0ff5b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110494 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/082fdb2c-88b4-42c3-8eeb-1817c0177198-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-hd8hn\" (UID: \"082fdb2c-88b4-42c3-8eeb-1817c0177198\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110537 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/137dc523-385f-4afb-b972-66093e2e071e-images\") pod \"machine-api-operator-5694c8668f-qnsxl\" (UID: \"137dc523-385f-4afb-b972-66093e2e071e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110581 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d02d52ba-4ba4-47b2-b0f3-a769e009d161-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110596 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg8nb\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-kube-api-access-wg8nb\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110611 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cb1f930b-6743-4222-900e-c2442f33be13-available-featuregates\") pod \"openshift-config-operator-7777fb866f-l9dpq\" (UID: \"cb1f930b-6743-4222-900e-c2442f33be13\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110628 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5f5483f1-06dc-4cf0-8fec-2e5f4e16e459-signing-key\") pod \"service-ca-9c57cc56f-4nz7f\" (UID: \"5f5483f1-06dc-4cf0-8fec-2e5f4e16e459\") " pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110662 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-csi-data-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110686 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/860a9876-b8f6-4125-bd1c-51518eb10283-secret-volume\") pod \"collect-profiles-29564280-d46pl\" (UID: \"860a9876-b8f6-4125-bd1c-51518eb10283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110680 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d02d52ba-4ba4-47b2-b0f3-a769e009d161-ca-trust-extracted\") pod 
\"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110706 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35531061-50ee-4972-bec4-75190551fbbe-proxy-tls\") pod \"machine-config-controller-84d6567774-ckkdv\" (UID: \"35531061-50ee-4972-bec4-75190551fbbe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110743 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfjvr\" (UniqueName: \"kubernetes.io/projected/87181d7a-94d9-4918-99d6-0fa95896bc05-kube-api-access-lfjvr\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptkj5\" (UID: \"87181d7a-94d9-4918-99d6-0fa95896bc05\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110828 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9gd4\" (UniqueName: \"kubernetes.io/projected/137dc523-385f-4afb-b972-66093e2e071e-kube-api-access-m9gd4\") pod \"machine-api-operator-5694c8668f-qnsxl\" (UID: \"137dc523-385f-4afb-b972-66093e2e071e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110853 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngrqk\" (UniqueName: \"kubernetes.io/projected/e2c07d05-3ff0-41e8-b792-7b349f553049-kube-api-access-ngrqk\") pod \"package-server-manager-789f6589d5-9fjbs\" (UID: \"e2c07d05-3ff0-41e8-b792-7b349f553049\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 
18:05:51.110920 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d02d52ba-4ba4-47b2-b0f3-a769e009d161-registry-certificates\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110967 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b10cc94-4769-4c11-a94d-7b01d8f228b1-service-ca-bundle\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110962 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9630528a-7c8c-46cd-8bf4-e116f35b6911-tmpfs\") pod \"packageserver-d55dfcdfc-g9mpk\" (UID: \"9630528a-7c8c-46cd-8bf4-e116f35b6911\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.111476 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/860a9876-b8f6-4125-bd1c-51518eb10283-config-volume\") pod \"collect-profiles-29564280-d46pl\" (UID: \"860a9876-b8f6-4125-bd1c-51518eb10283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.110711 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.118458 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.118990 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-config\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.119059 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/082fdb2c-88b4-42c3-8eeb-1817c0177198-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-hd8hn\" (UID: \"082fdb2c-88b4-42c3-8eeb-1817c0177198\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.120394 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-client-ca\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.121040 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.121339 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.121354 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/137dc523-385f-4afb-b972-66093e2e071e-images\") pod \"machine-api-operator-5694c8668f-qnsxl\" (UID: \"137dc523-385f-4afb-b972-66093e2e071e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.121915 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cb1f930b-6743-4222-900e-c2442f33be13-available-featuregates\") pod \"openshift-config-operator-7777fb866f-l9dpq\" (UID: \"cb1f930b-6743-4222-900e-c2442f33be13\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.122375 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2c07d05-3ff0-41e8-b792-7b349f553049-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9fjbs\" (UID: \"e2c07d05-3ff0-41e8-b792-7b349f553049\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs" Mar 18 
18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.122624 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b10cc94-4769-4c11-a94d-7b01d8f228b1-metrics-certs\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.122848 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/87181d7a-94d9-4918-99d6-0fa95896bc05-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptkj5\" (UID: \"87181d7a-94d9-4918-99d6-0fa95896bc05\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.123166 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-registry-tls\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.123998 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.124091 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.124170 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/082fdb2c-88b4-42c3-8eeb-1817c0177198-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-hd8hn\" (UID: \"082fdb2c-88b4-42c3-8eeb-1817c0177198\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.124201 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-etcd-client\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.124283 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7314b49-e434-4c49-babe-7ebc3925639a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tfg8c\" (UID: \"c7314b49-e434-4c49-babe-7ebc3925639a\") " pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.124379 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 
18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.124535 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.124672 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/35531061-50ee-4972-bec4-75190551fbbe-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ckkdv\" (UID: \"35531061-50ee-4972-bec4-75190551fbbe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.125121 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/119099cb-bd81-40db-8393-0ada3cbc7619-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k965r\" (UID: \"119099cb-bd81-40db-8393-0ada3cbc7619\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k965r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.125166 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.125249 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxcdl\" (UniqueName: \"kubernetes.io/projected/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-kube-api-access-mxcdl\") 
pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.125315 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb6e9847-508d-42d9-b429-2567547e41fe-config-volume\") pod \"dns-default-ltp2s\" (UID: \"eb6e9847-508d-42d9-b429-2567547e41fe\") " pod="openshift-dns/dns-default-ltp2s" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.125721 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5f5483f1-06dc-4cf0-8fec-2e5f4e16e459-signing-cabundle\") pod \"service-ca-9c57cc56f-4nz7f\" (UID: \"5f5483f1-06dc-4cf0-8fec-2e5f4e16e459\") " pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.125745 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/48236e40-cae1-46af-aa88-f5e038cc1a42-auth-proxy-config\") pod \"machine-approver-56656f9798-f5r2r\" (UID: \"48236e40-cae1-46af-aa88-f5e038cc1a42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.125816 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/137dc523-385f-4afb-b972-66093e2e071e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qnsxl\" (UID: \"137dc523-385f-4afb-b972-66093e2e071e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.125885 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/467ea6f3-9b84-4075-a6e2-0adbd72b6ddb-config\") pod \"kube-controller-manager-operator-78b949d7b-jc24d\" (UID: \"467ea6f3-9b84-4075-a6e2-0adbd72b6ddb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.125911 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdhdd\" (UniqueName: \"kubernetes.io/projected/951315e5-4217-41ba-8126-073fffe96d80-kube-api-access-tdhdd\") pod \"migrator-59844c95c7-crqfs\" (UID: \"951315e5-4217-41ba-8126-073fffe96d80\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-crqfs" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.125999 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxlb4\" (UniqueName: \"kubernetes.io/projected/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-kube-api-access-pxlb4\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.126074 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h589z\" (UniqueName: \"kubernetes.io/projected/119099cb-bd81-40db-8393-0ada3cbc7619-kube-api-access-h589z\") pod \"multus-admission-controller-857f4d67dd-k965r\" (UID: \"119099cb-bd81-40db-8393-0ada3cbc7619\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k965r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.126170 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.126190 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7b10cc94-4769-4c11-a94d-7b01d8f228b1-stats-auth\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.126262 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm9gh\" (UniqueName: \"kubernetes.io/projected/d5bc82db-9313-414e-aa86-ff630456fb49-kube-api-access-wm9gh\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.126327 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9630528a-7c8c-46cd-8bf4-e116f35b6911-webhook-cert\") pod \"packageserver-d55dfcdfc-g9mpk\" (UID: \"9630528a-7c8c-46cd-8bf4-e116f35b6911\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.126864 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.127126 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgsgw\" (UniqueName: 
\"kubernetes.io/projected/2c293129-16e9-4469-8307-f11ea13cd329-kube-api-access-lgsgw\") pod \"ingress-operator-5b745b69d9-hmccb\" (UID: \"2c293129-16e9-4469-8307-f11ea13cd329\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.127243 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-serving-cert\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.127322 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-audit-dir\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.127410 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c28c9a0-de21-4c01-a5d6-1f6490878a0b-config\") pod \"kube-apiserver-operator-766d6c64bb-sg7qz\" (UID: \"4c28c9a0-de21-4c01-a5d6-1f6490878a0b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.127438 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-audit-policies\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.127502 5008 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb1f930b-6743-4222-900e-c2442f33be13-serving-cert\") pod \"openshift-config-operator-7777fb866f-l9dpq\" (UID: \"cb1f930b-6743-4222-900e-c2442f33be13\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.129250 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5f5483f1-06dc-4cf0-8fec-2e5f4e16e459-signing-cabundle\") pod \"service-ca-9c57cc56f-4nz7f\" (UID: \"5f5483f1-06dc-4cf0-8fec-2e5f4e16e459\") " pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.131231 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.131907 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.134296 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.134452 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/48236e40-cae1-46af-aa88-f5e038cc1a42-auth-proxy-config\") pod 
\"machine-approver-56656f9798-f5r2r\" (UID: \"48236e40-cae1-46af-aa88-f5e038cc1a42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.134628 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d02d52ba-4ba4-47b2-b0f3-a769e009d161-registry-certificates\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.134944 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.137110 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/467ea6f3-9b84-4075-a6e2-0adbd72b6ddb-config\") pod \"kube-controller-manager-operator-78b949d7b-jc24d\" (UID: \"467ea6f3-9b84-4075-a6e2-0adbd72b6ddb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.167358 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c293129-16e9-4469-8307-f11ea13cd329-metrics-tls\") pod \"ingress-operator-5b745b69d9-hmccb\" (UID: \"2c293129-16e9-4469-8307-f11ea13cd329\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.167489 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.167727 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/119099cb-bd81-40db-8393-0ada3cbc7619-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k965r\" (UID: \"119099cb-bd81-40db-8393-0ada3cbc7619\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k965r"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.168096 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b10cc94-4769-4c11-a94d-7b01d8f228b1-metrics-certs\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.171396 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35531061-50ee-4972-bec4-75190551fbbe-proxy-tls\") pod \"machine-config-controller-84d6567774-ckkdv\" (UID: \"35531061-50ee-4972-bec4-75190551fbbe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.171521 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 
18:05:51.171778 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/137dc523-385f-4afb-b972-66093e2e071e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qnsxl\" (UID: \"137dc523-385f-4afb-b972-66093e2e071e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.172495 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c28c9a0-de21-4c01-a5d6-1f6490878a0b-config\") pod \"kube-apiserver-operator-766d6c64bb-sg7qz\" (UID: \"4c28c9a0-de21-4c01-a5d6-1f6490878a0b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.173565 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-bound-sa-token\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.173795 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/48236e40-cae1-46af-aa88-f5e038cc1a42-machine-approver-tls\") pod \"machine-approver-56656f9798-f5r2r\" (UID: \"48236e40-cae1-46af-aa88-f5e038cc1a42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.173816 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjr6v\" (UniqueName: \"kubernetes.io/projected/48236e40-cae1-46af-aa88-f5e038cc1a42-kube-api-access-cjr6v\") pod \"machine-approver-56656f9798-f5r2r\" (UID: \"48236e40-cae1-46af-aa88-f5e038cc1a42\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.174253 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d02d52ba-4ba4-47b2-b0f3-a769e009d161-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.174498 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7ngx\" (UniqueName: \"kubernetes.io/projected/7b10cc94-4769-4c11-a94d-7b01d8f228b1-kube-api-access-m7ngx\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.174878 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-audit-dir\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.175401 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/860a9876-b8f6-4125-bd1c-51518eb10283-secret-volume\") pod \"collect-profiles-29564280-d46pl\" (UID: \"860a9876-b8f6-4125-bd1c-51518eb10283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.175442 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c7314b49-e434-4c49-babe-7ebc3925639a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tfg8c\" (UID: \"c7314b49-e434-4c49-babe-7ebc3925639a\") " pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.175569 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.175691 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-audit-policies\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.175750 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-mountpoint-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.176598 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.176653 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22751667-e260-4216-b78b-cc49c0ff5b5a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bwf7f\" (UID: \"22751667-e260-4216-b78b-cc49c0ff5b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.176671 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/137dc523-385f-4afb-b972-66093e2e071e-config\") pod \"machine-api-operator-5694c8668f-qnsxl\" (UID: \"137dc523-385f-4afb-b972-66093e2e071e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.176688 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppqxf\" (UniqueName: \"kubernetes.io/projected/21073be2-2012-4c88-9797-92b12b7ef7db-kube-api-access-ppqxf\") pod \"service-ca-operator-777779d784-6ksxt\" (UID: \"21073be2-2012-4c88-9797-92b12b7ef7db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.177615 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/87181d7a-94d9-4918-99d6-0fa95896bc05-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptkj5\" (UID: \"87181d7a-94d9-4918-99d6-0fa95896bc05\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.177799 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-audit-policies\") pod 
\"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.178466 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c28c9a0-de21-4c01-a5d6-1f6490878a0b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sg7qz\" (UID: \"4c28c9a0-de21-4c01-a5d6-1f6490878a0b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.178629 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.178689 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d02d52ba-4ba4-47b2-b0f3-a769e009d161-trusted-ca\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.178718 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-audit-dir\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.178835 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/467ea6f3-9b84-4075-a6e2-0adbd72b6ddb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jc24d\" (UID: \"467ea6f3-9b84-4075-a6e2-0adbd72b6ddb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.178872 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkgqk\" (UniqueName: \"kubernetes.io/projected/51d04574-0631-403a-8bf5-4127787463d7-kube-api-access-pkgqk\") pod \"auto-csr-approver-29564284-6tsrt\" (UID: \"51d04574-0631-403a-8bf5-4127787463d7\") " pod="openshift-infra/auto-csr-approver-29564284-6tsrt"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.178903 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-audit-dir\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq"
Mar 18 18:05:51 crc kubenswrapper[5008]: E0318 18:05:51.179377 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:51.679360607 +0000 UTC m=+208.198833686 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.179895 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/137dc523-385f-4afb-b972-66093e2e071e-config\") pod \"machine-api-operator-5694c8668f-qnsxl\" (UID: \"137dc523-385f-4afb-b972-66093e2e071e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.180214 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d02d52ba-4ba4-47b2-b0f3-a769e009d161-trusted-ca\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.180261 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c293129-16e9-4469-8307-f11ea13cd329-trusted-ca\") pod \"ingress-operator-5b745b69d9-hmccb\" (UID: \"2c293129-16e9-4469-8307-f11ea13cd329\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.180301 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b4612145-a0ad-4f87-b1e8-9f17248900bd-certs\") pod \"machine-config-server-dx69r\" (UID: \"b4612145-a0ad-4f87-b1e8-9f17248900bd\") " 
pod="openshift-machine-config-operator/machine-config-server-dx69r"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.180834 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48236e40-cae1-46af-aa88-f5e038cc1a42-config\") pod \"machine-approver-56656f9798-f5r2r\" (UID: \"48236e40-cae1-46af-aa88-f5e038cc1a42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.180883 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhz2l\" (UniqueName: \"kubernetes.io/projected/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-kube-api-access-fhz2l\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.180903 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22751667-e260-4216-b78b-cc49c0ff5b5a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bwf7f\" (UID: \"22751667-e260-4216-b78b-cc49c0ff5b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.180942 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21073be2-2012-4c88-9797-92b12b7ef7db-config\") pod \"service-ca-operator-777779d784-6ksxt\" (UID: \"21073be2-2012-4c88-9797-92b12b7ef7db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.180960 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-encryption-config\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.180975 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5bc82db-9313-414e-aa86-ff630456fb49-serving-cert\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.181047 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdpgn\" (UniqueName: \"kubernetes.io/projected/cb1f930b-6743-4222-900e-c2442f33be13-kube-api-access-zdpgn\") pod \"openshift-config-operator-7777fb866f-l9dpq\" (UID: \"cb1f930b-6743-4222-900e-c2442f33be13\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.181070 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgmcw\" (UniqueName: \"kubernetes.io/projected/5f5483f1-06dc-4cf0-8fec-2e5f4e16e459-kube-api-access-bgmcw\") pod \"service-ca-9c57cc56f-4nz7f\" (UID: \"5f5483f1-06dc-4cf0-8fec-2e5f4e16e459\") " pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.181087 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/467ea6f3-9b84-4075-a6e2-0adbd72b6ddb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jc24d\" (UID: \"467ea6f3-9b84-4075-a6e2-0adbd72b6ddb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 
18:05:51.181106 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs5lk\" (UniqueName: \"kubernetes.io/projected/b4612145-a0ad-4f87-b1e8-9f17248900bd-kube-api-access-rs5lk\") pod \"machine-config-server-dx69r\" (UID: \"b4612145-a0ad-4f87-b1e8-9f17248900bd\") " pod="openshift-machine-config-operator/machine-config-server-dx69r"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.184349 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.184425 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.184500 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ade0f201-c317-40fc-bf82-293a53853ec4-cert\") pod \"ingress-canary-5bfwn\" (UID: \"ade0f201-c317-40fc-bf82-293a53853ec4\") " pod="openshift-ingress-canary/ingress-canary-5bfwn"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.184549 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb6e9847-508d-42d9-b429-2567547e41fe-metrics-tls\") pod \"dns-default-ltp2s\" (UID: 
\"eb6e9847-508d-42d9-b429-2567547e41fe\") " pod="openshift-dns/dns-default-ltp2s"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.184686 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c8f2\" (UniqueName: \"kubernetes.io/projected/860a9876-b8f6-4125-bd1c-51518eb10283-kube-api-access-2c8f2\") pod \"collect-profiles-29564280-d46pl\" (UID: \"860a9876-b8f6-4125-bd1c-51518eb10283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.184723 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh7gk\" (UniqueName: \"kubernetes.io/projected/082fdb2c-88b4-42c3-8eeb-1817c0177198-kube-api-access-qh7gk\") pod \"openshift-controller-manager-operator-756b6f6bc6-hd8hn\" (UID: \"082fdb2c-88b4-42c3-8eeb-1817c0177198\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.184792 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2204982-5f64-473e-809f-17a61cf942d8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-64lb7\" (UID: \"a2204982-5f64-473e-809f-17a61cf942d8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.184827 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21073be2-2012-4c88-9797-92b12b7ef7db-serving-cert\") pod \"service-ca-operator-777779d784-6ksxt\" (UID: \"21073be2-2012-4c88-9797-92b12b7ef7db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.184846 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-registration-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.184865 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hklz\" (UniqueName: \"kubernetes.io/projected/c7314b49-e434-4c49-babe-7ebc3925639a-kube-api-access-2hklz\") pod \"marketplace-operator-79b997595-tfg8c\" (UID: \"c7314b49-e434-4c49-babe-7ebc3925639a\") " pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.185488 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.190110 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/48236e40-cae1-46af-aa88-f5e038cc1a42-machine-approver-tls\") pod \"machine-approver-56656f9798-f5r2r\" (UID: \"48236e40-cae1-46af-aa88-f5e038cc1a42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.191725 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c293129-16e9-4469-8307-f11ea13cd329-trusted-ca\") pod \"ingress-operator-5b745b69d9-hmccb\" (UID: \"2c293129-16e9-4469-8307-f11ea13cd329\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.192259 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7b10cc94-4769-4c11-a94d-7b01d8f228b1-default-certificate\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.192844 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.193335 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.193529 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22751667-e260-4216-b78b-cc49c0ff5b5a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bwf7f\" (UID: \"22751667-e260-4216-b78b-cc49c0ff5b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f"
Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.193818 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/2c293129-16e9-4469-8307-f11ea13cd329-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hmccb\" (UID: \"2c293129-16e9-4469-8307-f11ea13cd329\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.194278 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22751667-e260-4216-b78b-cc49c0ff5b5a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bwf7f\" (UID: \"22751667-e260-4216-b78b-cc49c0ff5b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.195667 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7b10cc94-4769-4c11-a94d-7b01d8f228b1-stats-auth\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.196750 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21073be2-2012-4c88-9797-92b12b7ef7db-config\") pod \"service-ca-operator-777779d784-6ksxt\" (UID: \"21073be2-2012-4c88-9797-92b12b7ef7db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.197506 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5f5483f1-06dc-4cf0-8fec-2e5f4e16e459-signing-key\") pod \"service-ca-9c57cc56f-4nz7f\" (UID: \"5f5483f1-06dc-4cf0-8fec-2e5f4e16e459\") " pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.199813 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5bc82db-9313-414e-aa86-ff630456fb49-serving-cert\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.199969 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48236e40-cae1-46af-aa88-f5e038cc1a42-config\") pod \"machine-approver-56656f9798-f5r2r\" (UID: \"48236e40-cae1-46af-aa88-f5e038cc1a42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.200696 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/467ea6f3-9b84-4075-a6e2-0adbd72b6ddb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jc24d\" (UID: \"467ea6f3-9b84-4075-a6e2-0adbd72b6ddb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.201354 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-audit-policies\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.201929 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27hsn\" (UniqueName: \"kubernetes.io/projected/35531061-50ee-4972-bec4-75190551fbbe-kube-api-access-27hsn\") pod \"machine-config-controller-84d6567774-ckkdv\" (UID: \"35531061-50ee-4972-bec4-75190551fbbe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv" Mar 18 18:05:51 crc 
kubenswrapper[5008]: I0318 18:05:51.203457 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb1f930b-6743-4222-900e-c2442f33be13-serving-cert\") pod \"openshift-config-operator-7777fb866f-l9dpq\" (UID: \"cb1f930b-6743-4222-900e-c2442f33be13\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.203927 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21073be2-2012-4c88-9797-92b12b7ef7db-serving-cert\") pod \"service-ca-operator-777779d784-6ksxt\" (UID: \"21073be2-2012-4c88-9797-92b12b7ef7db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.206359 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-etcd-client\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.207646 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-serving-cert\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.208996 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.209436 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c28c9a0-de21-4c01-a5d6-1f6490878a0b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sg7qz\" (UID: \"4c28c9a0-de21-4c01-a5d6-1f6490878a0b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.213657 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-encryption-config\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.216127 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c28c9a0-de21-4c01-a5d6-1f6490878a0b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sg7qz\" (UID: \"4c28c9a0-de21-4c01-a5d6-1f6490878a0b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.219589 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg8nb\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-kube-api-access-wg8nb\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.219834 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.225599 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6"] Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.241325 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfjvr\" (UniqueName: \"kubernetes.io/projected/87181d7a-94d9-4918-99d6-0fa95896bc05-kube-api-access-lfjvr\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptkj5\" (UID: \"87181d7a-94d9-4918-99d6-0fa95896bc05\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.253929 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgdnf\" (UniqueName: \"kubernetes.io/projected/22751667-e260-4216-b78b-cc49c0ff5b5a-kube-api-access-mgdnf\") pod \"kube-storage-version-migrator-operator-b67b599dd-bwf7f\" (UID: \"22751667-e260-4216-b78b-cc49c0ff5b5a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.257680 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-vnxkn"] Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.273316 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9gd4\" (UniqueName: \"kubernetes.io/projected/137dc523-385f-4afb-b972-66093e2e071e-kube-api-access-m9gd4\") pod \"machine-api-operator-5694c8668f-qnsxl\" (UID: \"137dc523-385f-4afb-b972-66093e2e071e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.285473 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.285727 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ldt9\" (UniqueName: \"kubernetes.io/projected/9630528a-7c8c-46cd-8bf4-e116f35b6911-kube-api-access-9ldt9\") pod \"packageserver-d55dfcdfc-g9mpk\" (UID: \"9630528a-7c8c-46cd-8bf4-e116f35b6911\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.285763 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-plugins-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.285785 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b4612145-a0ad-4f87-b1e8-9f17248900bd-node-bootstrap-token\") pod \"machine-config-server-dx69r\" (UID: \"b4612145-a0ad-4f87-b1e8-9f17248900bd\") " pod="openshift-machine-config-operator/machine-config-server-dx69r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.285808 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9630528a-7c8c-46cd-8bf4-e116f35b6911-apiservice-cert\") pod \"packageserver-d55dfcdfc-g9mpk\" (UID: \"9630528a-7c8c-46cd-8bf4-e116f35b6911\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.285832 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2204982-5f64-473e-809f-17a61cf942d8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-64lb7\" (UID: \"a2204982-5f64-473e-809f-17a61cf942d8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.286032 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9hp8\" (UniqueName: \"kubernetes.io/projected/ade0f201-c317-40fc-bf82-293a53853ec4-kube-api-access-n9hp8\") pod \"ingress-canary-5bfwn\" (UID: \"ade0f201-c317-40fc-bf82-293a53853ec4\") " pod="openshift-ingress-canary/ingress-canary-5bfwn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.286387 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-socket-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.286483 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8qgm\" (UniqueName: \"kubernetes.io/projected/eb6e9847-508d-42d9-b429-2567547e41fe-kube-api-access-c8qgm\") pod \"dns-default-ltp2s\" (UID: \"eb6e9847-508d-42d9-b429-2567547e41fe\") " pod="openshift-dns/dns-default-ltp2s" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.286675 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-csi-data-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 
18:05:51.286781 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngrqk\" (UniqueName: \"kubernetes.io/projected/e2c07d05-3ff0-41e8-b792-7b349f553049-kube-api-access-ngrqk\") pod \"package-server-manager-789f6589d5-9fjbs\" (UID: \"e2c07d05-3ff0-41e8-b792-7b349f553049\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.286824 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9630528a-7c8c-46cd-8bf4-e116f35b6911-tmpfs\") pod \"packageserver-d55dfcdfc-g9mpk\" (UID: \"9630528a-7c8c-46cd-8bf4-e116f35b6911\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.286868 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2c07d05-3ff0-41e8-b792-7b349f553049-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9fjbs\" (UID: \"e2c07d05-3ff0-41e8-b792-7b349f553049\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.286915 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7314b49-e434-4c49-babe-7ebc3925639a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tfg8c\" (UID: \"c7314b49-e434-4c49-babe-7ebc3925639a\") " pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.286971 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxcdl\" (UniqueName: \"kubernetes.io/projected/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-kube-api-access-mxcdl\") pod 
\"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.287004 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb6e9847-508d-42d9-b429-2567547e41fe-config-volume\") pod \"dns-default-ltp2s\" (UID: \"eb6e9847-508d-42d9-b429-2567547e41fe\") " pod="openshift-dns/dns-default-ltp2s" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.287035 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdhdd\" (UniqueName: \"kubernetes.io/projected/951315e5-4217-41ba-8126-073fffe96d80-kube-api-access-tdhdd\") pod \"migrator-59844c95c7-crqfs\" (UID: \"951315e5-4217-41ba-8126-073fffe96d80\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-crqfs" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.287100 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9630528a-7c8c-46cd-8bf4-e116f35b6911-webhook-cert\") pod \"packageserver-d55dfcdfc-g9mpk\" (UID: \"9630528a-7c8c-46cd-8bf4-e116f35b6911\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.287209 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7314b49-e434-4c49-babe-7ebc3925639a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tfg8c\" (UID: \"c7314b49-e434-4c49-babe-7ebc3925639a\") " pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.287239 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-mountpoint-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.287323 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkgqk\" (UniqueName: \"kubernetes.io/projected/51d04574-0631-403a-8bf5-4127787463d7-kube-api-access-pkgqk\") pod \"auto-csr-approver-29564284-6tsrt\" (UID: \"51d04574-0631-403a-8bf5-4127787463d7\") " pod="openshift-infra/auto-csr-approver-29564284-6tsrt" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.287356 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b4612145-a0ad-4f87-b1e8-9f17248900bd-certs\") pod \"machine-config-server-dx69r\" (UID: \"b4612145-a0ad-4f87-b1e8-9f17248900bd\") " pod="openshift-machine-config-operator/machine-config-server-dx69r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.287432 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs5lk\" (UniqueName: \"kubernetes.io/projected/b4612145-a0ad-4f87-b1e8-9f17248900bd-kube-api-access-rs5lk\") pod \"machine-config-server-dx69r\" (UID: \"b4612145-a0ad-4f87-b1e8-9f17248900bd\") " pod="openshift-machine-config-operator/machine-config-server-dx69r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.287456 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ade0f201-c317-40fc-bf82-293a53853ec4-cert\") pod \"ingress-canary-5bfwn\" (UID: \"ade0f201-c317-40fc-bf82-293a53853ec4\") " pod="openshift-ingress-canary/ingress-canary-5bfwn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.287482 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/eb6e9847-508d-42d9-b429-2567547e41fe-metrics-tls\") pod \"dns-default-ltp2s\" (UID: \"eb6e9847-508d-42d9-b429-2567547e41fe\") " pod="openshift-dns/dns-default-ltp2s" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.287516 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2204982-5f64-473e-809f-17a61cf942d8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-64lb7\" (UID: \"a2204982-5f64-473e-809f-17a61cf942d8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.287732 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-mountpoint-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.289113 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb6e9847-508d-42d9-b429-2567547e41fe-config-volume\") pod \"dns-default-ltp2s\" (UID: \"eb6e9847-508d-42d9-b429-2567547e41fe\") " pod="openshift-dns/dns-default-ltp2s" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.290082 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-socket-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.290263 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-csi-data-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.290292 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-plugins-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: E0318 18:05:51.290443 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:51.790425988 +0000 UTC m=+208.309899067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.290870 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9630528a-7c8c-46cd-8bf4-e116f35b6911-tmpfs\") pod \"packageserver-d55dfcdfc-g9mpk\" (UID: \"9630528a-7c8c-46cd-8bf4-e116f35b6911\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.291808 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/c7314b49-e434-4c49-babe-7ebc3925639a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tfg8c\" (UID: \"c7314b49-e434-4c49-babe-7ebc3925639a\") " pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.292046 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-registration-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.292265 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2204982-5f64-473e-809f-17a61cf942d8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-64lb7\" (UID: \"a2204982-5f64-473e-809f-17a61cf942d8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.287554 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-registration-dir\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.293201 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hklz\" (UniqueName: \"kubernetes.io/projected/c7314b49-e434-4c49-babe-7ebc3925639a-kube-api-access-2hklz\") pod \"marketplace-operator-79b997595-tfg8c\" (UID: \"c7314b49-e434-4c49-babe-7ebc3925639a\") " pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.293240 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2204982-5f64-473e-809f-17a61cf942d8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-64lb7\" (UID: \"a2204982-5f64-473e-809f-17a61cf942d8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.294369 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2204982-5f64-473e-809f-17a61cf942d8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-64lb7\" (UID: \"a2204982-5f64-473e-809f-17a61cf942d8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.294680 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2c07d05-3ff0-41e8-b792-7b349f553049-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9fjbs\" (UID: \"e2c07d05-3ff0-41e8-b792-7b349f553049\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.295168 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9630528a-7c8c-46cd-8bf4-e116f35b6911-webhook-cert\") pod \"packageserver-d55dfcdfc-g9mpk\" (UID: \"9630528a-7c8c-46cd-8bf4-e116f35b6911\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.295301 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b4612145-a0ad-4f87-b1e8-9f17248900bd-certs\") pod \"machine-config-server-dx69r\" (UID: \"b4612145-a0ad-4f87-b1e8-9f17248900bd\") " 
pod="openshift-machine-config-operator/machine-config-server-dx69r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.295470 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9630528a-7c8c-46cd-8bf4-e116f35b6911-apiservice-cert\") pod \"packageserver-d55dfcdfc-g9mpk\" (UID: \"9630528a-7c8c-46cd-8bf4-e116f35b6911\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.296292 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7314b49-e434-4c49-babe-7ebc3925639a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tfg8c\" (UID: \"c7314b49-e434-4c49-babe-7ebc3925639a\") " pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.296482 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxlb4\" (UniqueName: \"kubernetes.io/projected/1ae5fd25-8e60-4287-8c79-260c3c82f5ae-kube-api-access-pxlb4\") pod \"apiserver-7bbb656c7d-hrvzq\" (UID: \"1ae5fd25-8e60-4287-8c79-260c3c82f5ae\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.299649 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ade0f201-c317-40fc-bf82-293a53853ec4-cert\") pod \"ingress-canary-5bfwn\" (UID: \"ade0f201-c317-40fc-bf82-293a53853ec4\") " pod="openshift-ingress-canary/ingress-canary-5bfwn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.303200 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b4612145-a0ad-4f87-b1e8-9f17248900bd-node-bootstrap-token\") pod \"machine-config-server-dx69r\" (UID: 
\"b4612145-a0ad-4f87-b1e8-9f17248900bd\") " pod="openshift-machine-config-operator/machine-config-server-dx69r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.307689 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb6e9847-508d-42d9-b429-2567547e41fe-metrics-tls\") pod \"dns-default-ltp2s\" (UID: \"eb6e9847-508d-42d9-b429-2567547e41fe\") " pod="openshift-dns/dns-default-ltp2s" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.316978 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h589z\" (UniqueName: \"kubernetes.io/projected/119099cb-bd81-40db-8393-0ada3cbc7619-kube-api-access-h589z\") pod \"multus-admission-controller-857f4d67dd-k965r\" (UID: \"119099cb-bd81-40db-8393-0ada3cbc7619\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k965r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.342743 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.349028 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm9gh\" (UniqueName: \"kubernetes.io/projected/d5bc82db-9313-414e-aa86-ff630456fb49-kube-api-access-wm9gh\") pod \"controller-manager-879f6c89f-s5m7q\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.371527 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-bound-sa-token\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.392407 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgsgw\" (UniqueName: \"kubernetes.io/projected/2c293129-16e9-4469-8307-f11ea13cd329-kube-api-access-lgsgw\") pod \"ingress-operator-5b745b69d9-hmccb\" (UID: \"2c293129-16e9-4469-8307-f11ea13cd329\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.394119 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: E0318 18:05:51.394626 5008 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:51.894613643 +0000 UTC m=+208.414086722 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.412002 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7ngx\" (UniqueName: \"kubernetes.io/projected/7b10cc94-4769-4c11-a94d-7b01d8f228b1-kube-api-access-m7ngx\") pod \"router-default-5444994796-tvxdw\" (UID: \"7b10cc94-4769-4c11-a94d-7b01d8f228b1\") " pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.436481 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24"] Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.440954 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.445041 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjr6v\" (UniqueName: \"kubernetes.io/projected/48236e40-cae1-46af-aa88-f5e038cc1a42-kube-api-access-cjr6v\") pod \"machine-approver-56656f9798-f5r2r\" (UID: \"48236e40-cae1-46af-aa88-f5e038cc1a42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.449049 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppqxf\" (UniqueName: \"kubernetes.io/projected/21073be2-2012-4c88-9797-92b12b7ef7db-kube-api-access-ppqxf\") pod \"service-ca-operator-777779d784-6ksxt\" (UID: \"21073be2-2012-4c88-9797-92b12b7ef7db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.449262 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.471582 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/467ea6f3-9b84-4075-a6e2-0adbd72b6ddb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jc24d\" (UID: \"467ea6f3-9b84-4075-a6e2-0adbd72b6ddb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.493530 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh7gk\" (UniqueName: \"kubernetes.io/projected/082fdb2c-88b4-42c3-8eeb-1817c0177198-kube-api-access-qh7gk\") pod \"openshift-controller-manager-operator-756b6f6bc6-hd8hn\" (UID: \"082fdb2c-88b4-42c3-8eeb-1817c0177198\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.494989 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:51 crc kubenswrapper[5008]: E0318 18:05:51.498345 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:51.998324354 +0000 UTC m=+208.517797433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.500633 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.516052 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c8f2\" (UniqueName: \"kubernetes.io/projected/860a9876-b8f6-4125-bd1c-51518eb10283-kube-api-access-2c8f2\") pod \"collect-profiles-29564280-d46pl\" (UID: \"860a9876-b8f6-4125-bd1c-51518eb10283\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.528844 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv"] Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.535272 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.536433 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdpgn\" (UniqueName: \"kubernetes.io/projected/cb1f930b-6743-4222-900e-c2442f33be13-kube-api-access-zdpgn\") pod \"openshift-config-operator-7777fb866f-l9dpq\" (UID: \"cb1f930b-6743-4222-900e-c2442f33be13\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.542064 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.546982 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f" Mar 18 18:05:51 crc kubenswrapper[5008]: W0318 18:05:51.550042 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35531061_50ee_4972_bec4_75190551fbbe.slice/crio-6c0cf96b5ffbb96d76e9b206e11c3b8d2412f6d450c3a12cf9115937ef60a74a WatchSource:0}: Error finding container 6c0cf96b5ffbb96d76e9b206e11c3b8d2412f6d450c3a12cf9115937ef60a74a: Status 404 returned error can't find the container with id 6c0cf96b5ffbb96d76e9b206e11c3b8d2412f6d450c3a12cf9115937ef60a74a Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.550496 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgmcw\" (UniqueName: \"kubernetes.io/projected/5f5483f1-06dc-4cf0-8fec-2e5f4e16e459-kube-api-access-bgmcw\") pod \"service-ca-9c57cc56f-4nz7f\" (UID: \"5f5483f1-06dc-4cf0-8fec-2e5f4e16e459\") " pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.554103 5008 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.569608 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.573998 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-gmczr"] Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.575267 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhz2l\" (UniqueName: \"kubernetes.io/projected/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-kube-api-access-fhz2l\") pod \"oauth-openshift-558db77b4-z9ssp\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.581492 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9"] Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.591866 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7"] Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.595162 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.595531 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdhdd\" (UniqueName: \"kubernetes.io/projected/951315e5-4217-41ba-8126-073fffe96d80-kube-api-access-tdhdd\") pod \"migrator-59844c95c7-crqfs\" (UID: \"951315e5-4217-41ba-8126-073fffe96d80\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-crqfs" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.598934 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.602901 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5"] Mar 18 18:05:51 crc kubenswrapper[5008]: E0318 18:05:51.603125 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:52.103107607 +0000 UTC m=+208.622580686 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.603519 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-k965r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.611768 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.614426 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxcdl\" (UniqueName: \"kubernetes.io/projected/f3850a13-89c7-43a5-a58e-ad6fff6ba32f-kube-api-access-mxcdl\") pod \"csi-hostpathplugin-2w6x4\" (UID: \"f3850a13-89c7-43a5-a58e-ad6fff6ba32f\") " pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.615931 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.629425 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.631663 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9hp8\" (UniqueName: \"kubernetes.io/projected/ade0f201-c317-40fc-bf82-293a53853ec4-kube-api-access-n9hp8\") pod \"ingress-canary-5bfwn\" (UID: \"ade0f201-c317-40fc-bf82-293a53853ec4\") " pod="openshift-ingress-canary/ingress-canary-5bfwn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.634856 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-crqfs" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.654276 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8qgm\" (UniqueName: \"kubernetes.io/projected/eb6e9847-508d-42d9-b429-2567547e41fe-kube-api-access-c8qgm\") pod \"dns-default-ltp2s\" (UID: \"eb6e9847-508d-42d9-b429-2567547e41fe\") " pod="openshift-dns/dns-default-ltp2s" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.669797 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.678492 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5bfwn" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.692140 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ltp2s" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.702825 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.704149 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-s5pml"] Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.704422 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8jq26"] Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.704793 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:51 crc kubenswrapper[5008]: E0318 18:05:51.705170 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:52.205156328 +0000 UTC m=+208.724629407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.712179 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs5lk\" (UniqueName: \"kubernetes.io/projected/b4612145-a0ad-4f87-b1e8-9f17248900bd-kube-api-access-rs5lk\") pod \"machine-config-server-dx69r\" (UID: \"b4612145-a0ad-4f87-b1e8-9f17248900bd\") " pod="openshift-machine-config-operator/machine-config-server-dx69r" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.713978 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkgqk\" (UniqueName: \"kubernetes.io/projected/51d04574-0631-403a-8bf5-4127787463d7-kube-api-access-pkgqk\") pod \"auto-csr-approver-29564284-6tsrt\" (UID: \"51d04574-0631-403a-8bf5-4127787463d7\") " pod="openshift-infra/auto-csr-approver-29564284-6tsrt" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.718315 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngrqk\" (UniqueName: \"kubernetes.io/projected/e2c07d05-3ff0-41e8-b792-7b349f553049-kube-api-access-ngrqk\") pod \"package-server-manager-789f6589d5-9fjbs\" (UID: \"e2c07d05-3ff0-41e8-b792-7b349f553049\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.719468 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qnsxl"] Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.723780 5008 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.737213 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ldt9\" (UniqueName: \"kubernetes.io/projected/9630528a-7c8c-46cd-8bf4-e116f35b6911-kube-api-access-9ldt9\") pod \"packageserver-d55dfcdfc-g9mpk\" (UID: \"9630528a-7c8c-46cd-8bf4-e116f35b6911\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.757546 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.759385 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2204982-5f64-473e-809f-17a61cf942d8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-64lb7\" (UID: \"a2204982-5f64-473e-809f-17a61cf942d8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.770492 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw" event={"ID":"fd727bd7-0dd3-44a6-90da-e63c81fb6194","Type":"ContainerStarted","Data":"8559e3902dfd8cdb110952bcea541503b5c302a1c20e1d792cdbf12390568811"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.770534 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw" event={"ID":"fd727bd7-0dd3-44a6-90da-e63c81fb6194","Type":"ContainerStarted","Data":"50c59d0eef6d30005246106bed0c730e8c9df23b9558b400e7dc0f42b0eeef86"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.770545 5008 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw" event={"ID":"fd727bd7-0dd3-44a6-90da-e63c81fb6194","Type":"ContainerStarted","Data":"16dc636ac5eafbe5a567ec901efcd36243213dc04c90dbe3ffdcbbfc52f956e0"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.774580 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" event={"ID":"2d627459-6ea9-460c-8f9e-1fe47bcc59e1","Type":"ContainerStarted","Data":"d3ab982c38ef3135f6b677b93de4e8a93bb6080f03d29d68fa0c0dc7ebf7d5cf"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.775650 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" event={"ID":"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae","Type":"ContainerStarted","Data":"8371fbac342da0cf4e1ebcaddd703af0a93f8a132c178bd527027cbc4b2d3f86"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.775675 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" event={"ID":"102c9d92-61ef-4147-aaf2-7a9e6a1fbfae","Type":"ContainerStarted","Data":"ce9651c79db496537e05f17e367ba344e6ed43ca61ee63391799a0f6e799cc69"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.776036 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hklz\" (UniqueName: \"kubernetes.io/projected/c7314b49-e434-4c49-babe-7ebc3925639a-kube-api-access-2hklz\") pod \"marketplace-operator-79b997595-tfg8c\" (UID: \"c7314b49-e434-4c49-babe-7ebc3925639a\") " pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.777540 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-n6cmd" 
event={"ID":"5cfb606e-0b94-4fef-b4c9-92cd528eab5c","Type":"ContainerStarted","Data":"8870abcfc5a6dee127997bb04dc39f8497c31c293dfbb7cea9681b683e1927ff"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.777580 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-n6cmd" event={"ID":"5cfb606e-0b94-4fef-b4c9-92cd528eab5c","Type":"ContainerStarted","Data":"d89fdf1d459ea6bb17d574dda0592ff4b9732baae22ad7f51ad244028d509514"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.778448 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-n6cmd" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.779407 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9" event={"ID":"66c96a01-fca4-48c5-bb58-c5e954cdf1a2","Type":"ContainerStarted","Data":"c21faa1979d03990074fa4885012b1e061d8b882dbc1a61c6255b776faea49bc"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.785272 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" event={"ID":"225c4962-d9d2-4d32-85de-51872521d9a3","Type":"ContainerStarted","Data":"5a08372f8a3885b21bd1c9e871b024a355dce67eb58bd73e324f9677d8f43444"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.785345 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" event={"ID":"225c4962-d9d2-4d32-85de-51872521d9a3","Type":"ContainerStarted","Data":"d14784e67b968db0179cb4f6a40a3f0b6a70dc0140ac8993cbd6f427176d925c"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.786670 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 
18:05:51.790770 5008 patch_prober.go:28] interesting pod/console-operator-58897d9998-n6cmd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.790839 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-n6cmd" podUID="5cfb606e-0b94-4fef-b4c9-92cd528eab5c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.793210 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq"] Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.803301 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" event={"ID":"0c14eea6-708c-4f37-a1e1-67ae91804b9d","Type":"ContainerStarted","Data":"f7f611907b47b2011199f911bf1f59126f1d1c7e7548dc42279f01b6065ccac0"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.804719 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5" event={"ID":"87181d7a-94d9-4918-99d6-0fa95896bc05","Type":"ContainerStarted","Data":"ff1f01ef572c513ed04d27924e1a5b25c127d32bb836665e904064b44a7917a8"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.816978 5008 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-dp77z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 
18:05:51.817069 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" podUID="225c4962-d9d2-4d32-85de-51872521d9a3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.819633 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:51 crc kubenswrapper[5008]: E0318 18:05:51.820490 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:52.320465837 +0000 UTC m=+208.839938916 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.831758 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.837972 5008 generic.go:334] "Generic (PLEG): container finished" podID="1d56a0c1-b18d-4e8c-acff-e57f000b3744" containerID="b7f2eca3c46bc965aefb95a43b9b0873611654b6ad662ffc2d4e31c3eee029b5" exitCode=0 Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.838502 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bprvc" event={"ID":"1d56a0c1-b18d-4e8c-acff-e57f000b3744","Type":"ContainerDied","Data":"b7f2eca3c46bc965aefb95a43b9b0873611654b6ad662ffc2d4e31c3eee029b5"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.838582 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bprvc" event={"ID":"1d56a0c1-b18d-4e8c-acff-e57f000b3744","Type":"ContainerStarted","Data":"e9ba9976daa8ae0a2bec885f32c417fc4d9951b4fb7adae1917001509d78427d"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.848222 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv" event={"ID":"35531061-50ee-4972-bec4-75190551fbbe","Type":"ContainerStarted","Data":"6c0cf96b5ffbb96d76e9b206e11c3b8d2412f6d450c3a12cf9115937ef60a74a"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.851922 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gmczr" event={"ID":"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc","Type":"ContainerStarted","Data":"b2970a32373ae1b6aa4b6a5006f622a892412da13640072a06f1375a4d9c80a5"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.854875 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" 
event={"ID":"c3639ff6-daf7-495e-a7c7-b687a2bb9262","Type":"ContainerStarted","Data":"5b43b3ceefaabe2f98fd087ad6c7ee1fe114a7c0d620d9dce24f4a9127764eeb"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.854918 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" event={"ID":"c3639ff6-daf7-495e-a7c7-b687a2bb9262","Type":"ContainerStarted","Data":"9261df8515fa5dae27489e586d926769daf96261814f4ad75baba8e9bc6fcf66"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.863699 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md" event={"ID":"e7856ce5-83fa-4265-84e5-73d635bcbd17","Type":"ContainerStarted","Data":"1fbe9d84c0aefd1d5b7f0d93425cda6947f1d5bf5233b5cba2806a64e44e68db"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.863809 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md" event={"ID":"e7856ce5-83fa-4265-84e5-73d635bcbd17","Type":"ContainerStarted","Data":"6eae81784bfc8b1fbbbbf9783d0b71784a013c5ae69d8a91b168f69eadae0daa"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.867894 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d7v25" event={"ID":"57e1d26b-c119-48dc-8f78-35168b785d47","Type":"ContainerStarted","Data":"e1ce33bc1468338272ae6367eca7dcc01f7c80a80b7869ba0431f42317726b80"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.867935 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d7v25" event={"ID":"57e1d26b-c119-48dc-8f78-35168b785d47","Type":"ContainerStarted","Data":"8856dad0a1291e2da793f75baf009e47844966c38c96af3f02b3e7ad1204f06f"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.867945 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-d7v25" event={"ID":"57e1d26b-c119-48dc-8f78-35168b785d47","Type":"ContainerStarted","Data":"2579deae379dcf0fde093f1f2032ac039e26296ef9a7272b94213e98b43769ef"} Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.929194 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:51 crc kubenswrapper[5008]: W0318 18:05:51.929937 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b10cc94_4769_4c11_a94d_7b01d8f228b1.slice/crio-f122cc36ecd53d4ace284c393e003fd216d0f68d04d62e9210aa809f9e41734d WatchSource:0}: Error finding container f122cc36ecd53d4ace284c393e003fd216d0f68d04d62e9210aa809f9e41734d: Status 404 returned error can't find the container with id f122cc36ecd53d4ace284c393e003fd216d0f68d04d62e9210aa809f9e41734d Mar 18 18:05:51 crc kubenswrapper[5008]: E0318 18:05:51.930601 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:52.430548589 +0000 UTC m=+208.950021668 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.951439 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.965250 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.969653 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.983720 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564284-6tsrt" Mar 18 18:05:51 crc kubenswrapper[5008]: I0318 18:05:51.988428 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dx69r" Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.281066 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:52 crc kubenswrapper[5008]: E0318 18:05:52.281413 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:52.78140168 +0000 UTC m=+209.300874759 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.309163 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz"] Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.328523 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f"] Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.385135 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:52 crc kubenswrapper[5008]: E0318 18:05:52.385295 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:52.88526273 +0000 UTC m=+209.404735809 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.385846 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:52 crc kubenswrapper[5008]: E0318 18:05:52.386417 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:52.886399634 +0000 UTC m=+209.405872713 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.487295 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:52 crc kubenswrapper[5008]: E0318 18:05:52.487537 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:52.987492841 +0000 UTC m=+209.506965920 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.488731 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:52 crc kubenswrapper[5008]: E0318 18:05:52.489128 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:52.989113439 +0000 UTC m=+209.508586518 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.515549 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl"] Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.541241 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb"] Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.592464 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:52 crc kubenswrapper[5008]: E0318 18:05:52.596496 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:53.096452292 +0000 UTC m=+209.615925371 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.604712 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d"] Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.660862 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-vnxkn" podStartSLOduration=157.660820663 podStartE2EDuration="2m37.660820663s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:52.630029489 +0000 UTC m=+209.149502568" watchObservedRunningTime="2026-03-18 18:05:52.660820663 +0000 UTC m=+209.180293912" Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.728684 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:52 crc kubenswrapper[5008]: E0318 18:05:52.729110 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 18:05:53.229090741 +0000 UTC m=+209.748563890 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.751145 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-s5m7q"] Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.777974 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-d7v25" podStartSLOduration=157.777947686 podStartE2EDuration="2m37.777947686s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:52.728691089 +0000 UTC m=+209.248164168" watchObservedRunningTime="2026-03-18 18:05:52.777947686 +0000 UTC m=+209.297420765" Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.800167 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k965r"] Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.840233 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.841034 5008 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wq5x6" podStartSLOduration=157.840995607 podStartE2EDuration="2m37.840995607s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:52.8384177 +0000 UTC m=+209.357890779" watchObservedRunningTime="2026-03-18 18:05:52.840995607 +0000 UTC m=+209.360468686" Mar 18 18:05:52 crc kubenswrapper[5008]: E0318 18:05:52.864417 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:53.364370969 +0000 UTC m=+209.883844048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.864575 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:52 crc kubenswrapper[5008]: E0318 18:05:52.865092 5008 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:53.36508488 +0000 UTC m=+209.884557959 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.875942 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c5wlw" podStartSLOduration=157.875913315 podStartE2EDuration="2m37.875913315s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:52.863059979 +0000 UTC m=+209.382533058" watchObservedRunningTime="2026-03-18 18:05:52.875913315 +0000 UTC m=+209.395386394" Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.905554 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r" event={"ID":"48236e40-cae1-46af-aa88-f5e038cc1a42","Type":"ContainerStarted","Data":"8fa2faefd953233c960ac43602f3a1c95692509435aaa5af4bb6d81a74ff3f9d"} Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.924368 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt"] Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.935830 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-crqfs"] Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.969697 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:52 crc kubenswrapper[5008]: E0318 18:05:52.969816 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:53.469788871 +0000 UTC m=+209.989261950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.970484 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:52 crc kubenswrapper[5008]: E0318 18:05:52.971525 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-18 18:05:53.471511872 +0000 UTC m=+209.990984951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.972577 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz" event={"ID":"4c28c9a0-de21-4c01-a5d6-1f6490878a0b","Type":"ContainerStarted","Data":"1046e5372e0b01a6517a321a72ecf3b193ee899e2269228f677ff7ba575dd803"} Mar 18 18:05:52 crc kubenswrapper[5008]: W0318 18:05:52.972746 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod467ea6f3_9b84_4075_a6e2_0adbd72b6ddb.slice/crio-bfd76096dcd80580991fbb4e0070b0e48f8ad7fe3a3a9f569c2a24c92b4ea8dc WatchSource:0}: Error finding container bfd76096dcd80580991fbb4e0070b0e48f8ad7fe3a3a9f569c2a24c92b4ea8dc: Status 404 returned error can't find the container with id bfd76096dcd80580991fbb4e0070b0e48f8ad7fe3a3a9f569c2a24c92b4ea8dc Mar 18 18:05:52 crc kubenswrapper[5008]: I0318 18:05:52.976039 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4nz7f"] Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.010964 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq"] Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.011591 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" podStartSLOduration=158.011575274 podStartE2EDuration="2m38.011575274s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:53.003399109 +0000 UTC m=+209.522872198" watchObservedRunningTime="2026-03-18 18:05:53.011575274 +0000 UTC m=+209.531048353" Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.033167 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv" event={"ID":"35531061-50ee-4972-bec4-75190551fbbe","Type":"ContainerStarted","Data":"902817abeb671dc220714af7997247504d0483c228aad742eb4f79f8f91fc6c9"} Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.047637 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gmczr" event={"ID":"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc","Type":"ContainerStarted","Data":"5ddcb73aa02c51fd03cf5e8730d944f6a3370747a550036feea01bdf26bf2138"} Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.062370 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" event={"ID":"2d627459-6ea9-460c-8f9e-1fe47bcc59e1","Type":"ContainerStarted","Data":"e30faf9ecb97e41b5533da248b99ef157d71ead8c5ff4400255460ac05c204d0"} Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.074119 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:53 crc kubenswrapper[5008]: E0318 18:05:53.075396 5008 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:53.575376118 +0000 UTC m=+210.094849197 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.077835 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ltp2s"] Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.092652 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-z9ssp"] Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.112878 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn"] Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.134852 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb" event={"ID":"2c293129-16e9-4469-8307-f11ea13cd329","Type":"ContainerStarted","Data":"2e7f9234062afe795cadfe0b464edeb4659e3385b814618d61bb4ebc65ebcb0d"} Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.162105 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2w6x4"] Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.175668 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:53 crc kubenswrapper[5008]: E0318 18:05:53.176014 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:53.675999376 +0000 UTC m=+210.195472455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.180022 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5" event={"ID":"87181d7a-94d9-4918-99d6-0fa95896bc05","Type":"ContainerStarted","Data":"8422b24ff1ef9ab1c9d9365b88c42570c81e6909ba908b9a204b1fd1090a2421"} Mar 18 18:05:53 crc kubenswrapper[5008]: W0318 18:05:53.205240 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4612145_a0ad_4f87_b1e8_9f17248900bd.slice/crio-e85ffce6472425427d1544285a480fceeef078aa93244f1f2e1fde912e6204b3 WatchSource:0}: Error finding container e85ffce6472425427d1544285a480fceeef078aa93244f1f2e1fde912e6204b3: Status 404 returned error can't find the container with id e85ffce6472425427d1544285a480fceeef078aa93244f1f2e1fde912e6204b3 Mar 18 18:05:53 crc 
kubenswrapper[5008]: I0318 18:05:53.217314 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9" event={"ID":"66c96a01-fca4-48c5-bb58-c5e954cdf1a2","Type":"ContainerStarted","Data":"40b97e9c5c52435557cc4f5e0f5c0e8942d6230b9b04aab90b8196f8ec873948"} Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.217595 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9" Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.226429 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f" event={"ID":"22751667-e260-4216-b78b-cc49c0ff5b5a","Type":"ContainerStarted","Data":"db74818ee3a93d3a2d6dd5e00e3659fa1948fd92d92449d22e71555b00bfbb47"} Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.233249 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2g6md" podStartSLOduration=158.233223873 podStartE2EDuration="2m38.233223873s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:53.226825161 +0000 UTC m=+209.746298240" watchObservedRunningTime="2026-03-18 18:05:53.233223873 +0000 UTC m=+209.752696972" Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.248225 5008 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-9zlx9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.248274 5008 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9" podUID="66c96a01-fca4-48c5-bb58-c5e954cdf1a2" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.265937 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" event={"ID":"137dc523-385f-4afb-b972-66093e2e071e","Type":"ContainerStarted","Data":"57cb20ce4fc846d4a66f00b4930a01788965c8bbd9682fbd8da702f0ac74a756"} Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.286518 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:53 crc kubenswrapper[5008]: E0318 18:05:53.287353 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:53.787335396 +0000 UTC m=+210.306808475 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.374272 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tfg8c"] Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.380846 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5bfwn"] Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.391911 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:53 crc kubenswrapper[5008]: E0318 18:05:53.392873 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:53.892860081 +0000 UTC m=+210.412333150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.395606 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-n6cmd" podStartSLOduration=158.395590903 podStartE2EDuration="2m38.395590903s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:53.392679526 +0000 UTC m=+209.912152605" watchObservedRunningTime="2026-03-18 18:05:53.395590903 +0000 UTC m=+209.915063982" Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.399837 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tvxdw" event={"ID":"7b10cc94-4769-4c11-a94d-7b01d8f228b1","Type":"ContainerStarted","Data":"f122cc36ecd53d4ace284c393e003fd216d0f68d04d62e9210aa809f9e41734d"} Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.432861 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" event={"ID":"ebf037f0-6468-45ec-a599-49652456a53f","Type":"ContainerStarted","Data":"bff39da6561ec4c187c6a101ea0ac4c1330c2d72be030971a0507aaf613f0632"} Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.436978 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" 
event={"ID":"0c14eea6-708c-4f37-a1e1-67ae91804b9d","Type":"ContainerStarted","Data":"f7b1f04ab9196aa3e4ac494b4c2ddf8f0640f90c0abde11724f89dfbd3757dd0"} Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.437869 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.447174 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" event={"ID":"1ae5fd25-8e60-4287-8c79-260c3c82f5ae","Type":"ContainerStarted","Data":"032a4d16941707a8fa9da35cd08a5232cdb4d024f1356b5e1e1318a3a483944c"} Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.448033 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.458690 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs"] Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.467407 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptkj5" podStartSLOduration=158.467385946 podStartE2EDuration="2m38.467385946s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:53.464578292 +0000 UTC m=+209.984051371" watchObservedRunningTime="2026-03-18 18:05:53.467385946 +0000 UTC m=+209.986859025" Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.489480 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-s5pml" 
event={"ID":"31a94b93-89d6-4fab-87d7-05ecd80f55ec","Type":"ContainerStarted","Data":"7bdee570e95ef99d6355bf1f046e031c32304e65203de6f854e4486f6ba454c8"} Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.492384 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:53 crc kubenswrapper[5008]: E0318 18:05:53.492722 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:53.992708386 +0000 UTC m=+210.512181465 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.515169 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-gmczr" podStartSLOduration=158.515155929 podStartE2EDuration="2m38.515155929s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:53.514104928 +0000 UTC m=+210.033578007" watchObservedRunningTime="2026-03-18 18:05:53.515155929 +0000 UTC m=+210.034629008" Mar 18 18:05:53 crc 
kubenswrapper[5008]: I0318 18:05:53.515353 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.520010 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-n6cmd" Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.548102 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9" podStartSLOduration=158.548046426 podStartE2EDuration="2m38.548046426s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:53.545876731 +0000 UTC m=+210.065349810" watchObservedRunningTime="2026-03-18 18:05:53.548046426 +0000 UTC m=+210.067519515" Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.594266 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:53 crc kubenswrapper[5008]: E0318 18:05:53.594960 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:54.094938402 +0000 UTC m=+210.614411481 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.606813 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tpw24" podStartSLOduration=158.606776878 podStartE2EDuration="2m38.606776878s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:53.594074447 +0000 UTC m=+210.113547526" watchObservedRunningTime="2026-03-18 18:05:53.606776878 +0000 UTC m=+210.126249957" Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.702725 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk"] Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.717331 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:53 crc kubenswrapper[5008]: E0318 18:05:53.717679 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 18:05:54.217651343 +0000 UTC m=+210.737124422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.717890 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:53 crc kubenswrapper[5008]: E0318 18:05:53.718246 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:54.218239511 +0000 UTC m=+210.737712590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.739274 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564284-6tsrt"] Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.769727 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7"] Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.818631 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:53 crc kubenswrapper[5008]: E0318 18:05:53.818814 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:54.318778827 +0000 UTC m=+210.838251906 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.819426 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:53 crc kubenswrapper[5008]: E0318 18:05:53.819856 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:54.319839149 +0000 UTC m=+210.839312228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.888923 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.921357 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:53 crc kubenswrapper[5008]: E0318 18:05:53.921464 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:54.421445036 +0000 UTC m=+210.940918115 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:53 crc kubenswrapper[5008]: I0318 18:05:53.921725 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:53 crc kubenswrapper[5008]: E0318 18:05:53.922054 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:54.422047504 +0000 UTC m=+210.941520583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.023115 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:54 crc kubenswrapper[5008]: E0318 18:05:54.023434 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:54.523420155 +0000 UTC m=+211.042893224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.125255 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:54 crc kubenswrapper[5008]: E0318 18:05:54.126037 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:54.626024873 +0000 UTC m=+211.145497942 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.231129 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:54 crc kubenswrapper[5008]: E0318 18:05:54.244291 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:54.74426993 +0000 UTC m=+211.263743009 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.333664 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:54 crc kubenswrapper[5008]: E0318 18:05:54.333971 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:54.83395599 +0000 UTC m=+211.353429069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.436032 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:54 crc kubenswrapper[5008]: E0318 18:05:54.436671 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:54.93665492 +0000 UTC m=+211.456127999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.460126 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.460193 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.538471 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:54 crc kubenswrapper[5008]: E0318 18:05:54.539057 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 18:05:55.039045562 +0000 UTC m=+211.558518641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.539816 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564284-6tsrt" event={"ID":"51d04574-0631-403a-8bf5-4127787463d7","Type":"ContainerStarted","Data":"005063dad0bdd4186802503e453f79d06cce557cc0ca364ed8c98b6d1f00b40c"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.579024 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tvxdw" event={"ID":"7b10cc94-4769-4c11-a94d-7b01d8f228b1","Type":"ContainerStarted","Data":"78675f54cd92b711743d226179cbd7b7bbb1a5653de10b82434b22edeabe014a"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.611297 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn" event={"ID":"082fdb2c-88b4-42c3-8eeb-1817c0177198","Type":"ContainerStarted","Data":"6c75676604bac1e8f4d9f5ab8a060929eefcd7d050250001bfb254f5e4889add"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.611351 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn" event={"ID":"082fdb2c-88b4-42c3-8eeb-1817c0177198","Type":"ContainerStarted","Data":"c39047497bc04a00ebe2e187434df2f549c9a8d19934da5a1187ea015420eae9"} Mar 18 18:05:54 crc kubenswrapper[5008]: 
I0318 18:05:54.640506 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:54 crc kubenswrapper[5008]: E0318 18:05:54.640669 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:55.140646549 +0000 UTC m=+211.660119628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.640933 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:54 crc kubenswrapper[5008]: E0318 18:05:54.641276 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 18:05:55.141265508 +0000 UTC m=+211.660738587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.670490 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bprvc" event={"ID":"1d56a0c1-b18d-4e8c-acff-e57f000b3744","Type":"ContainerStarted","Data":"28f337756397c8232c3435e492ab526e1f269a7f63e45ff0141be64fb0e760be"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.703927 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs" event={"ID":"e2c07d05-3ff0-41e8-b792-7b349f553049","Type":"ContainerStarted","Data":"3e3c9908979651bced0df3e00d84570775d5492e3df24110c1e3a4147090e1e4"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.742185 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:54 crc kubenswrapper[5008]: E0318 18:05:54.743457 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 18:05:55.243430602 +0000 UTC m=+211.762903741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.747798 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5bfwn" event={"ID":"ade0f201-c317-40fc-bf82-293a53853ec4","Type":"ContainerStarted","Data":"0e563cb65510dc6811f9c452eb85d41f11dc97b7bb91a7d35dbaedcb968ca89d"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.786874 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k965r" event={"ID":"119099cb-bd81-40db-8393-0ada3cbc7619","Type":"ContainerStarted","Data":"5e73e3c84229f29ecbfc09c39f3165cce27dc08a2a4e947a2526e2b9b6c40437"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.803286 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-hd8hn" podStartSLOduration=159.803266007 podStartE2EDuration="2m39.803266007s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:54.79869755 +0000 UTC m=+211.318170639" watchObservedRunningTime="2026-03-18 18:05:54.803266007 +0000 UTC m=+211.322739086" Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.845006 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:54 crc kubenswrapper[5008]: E0318 18:05:54.845344 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:55.345332819 +0000 UTC m=+211.864805888 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.848873 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-tvxdw" podStartSLOduration=159.848849474 podStartE2EDuration="2m39.848849474s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:54.845885435 +0000 UTC m=+211.365358514" watchObservedRunningTime="2026-03-18 18:05:54.848849474 +0000 UTC m=+211.368322553" Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.854230 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d" 
event={"ID":"467ea6f3-9b84-4075-a6e2-0adbd72b6ddb","Type":"ContainerStarted","Data":"bc056313de3ce733b98e0336e99054a3bd50cb97437f245c7433ef49e9af83f8"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.854274 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d" event={"ID":"467ea6f3-9b84-4075-a6e2-0adbd72b6ddb","Type":"ContainerStarted","Data":"bfd76096dcd80580991fbb4e0070b0e48f8ad7fe3a3a9f569c2a24c92b4ea8dc"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.854583 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xth4j"] Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.855438 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.864802 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f" event={"ID":"22751667-e260-4216-b78b-cc49c0ff5b5a","Type":"ContainerStarted","Data":"f183eb75d1853379d9ebb89165a75e8916ebd9c379077dba9ed1f256b3c633e8"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.865814 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.882989 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f" event={"ID":"5f5483f1-06dc-4cf0-8fec-2e5f4e16e459","Type":"ContainerStarted","Data":"1582c4ce9c545856eefb4ab06530a232510b7f42b38a8f2f00cfe27ab1e2e057"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.883029 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f" 
event={"ID":"5f5483f1-06dc-4cf0-8fec-2e5f4e16e459","Type":"ContainerStarted","Data":"fed905ba2efa592993a4b6fd242a52d81d756d585556b3614d0d18c8405114a7"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.900355 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jc24d" podStartSLOduration=159.900339849 podStartE2EDuration="2m39.900339849s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:54.900085681 +0000 UTC m=+211.419558760" watchObservedRunningTime="2026-03-18 18:05:54.900339849 +0000 UTC m=+211.419812928" Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.908189 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ltp2s" event={"ID":"eb6e9847-508d-42d9-b429-2567547e41fe","Type":"ContainerStarted","Data":"05fd865461ea7131a76b24eb6adcee197b2f428d0fa9eed93d1e0dc27ab54f60"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.908449 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.910446 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xth4j"] Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.960989 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:54 crc kubenswrapper[5008]: E0318 18:05:54.961442 5008 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:55.461420131 +0000 UTC m=+211.980893210 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.961762 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p7jc\" (UniqueName: \"kubernetes.io/projected/420d2432-4da0-4be3-8489-56c06a682e03-kube-api-access-5p7jc\") pod \"certified-operators-xth4j\" (UID: \"420d2432-4da0-4be3-8489-56c06a682e03\") " pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.962259 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/420d2432-4da0-4be3-8489-56c06a682e03-utilities\") pod \"certified-operators-xth4j\" (UID: \"420d2432-4da0-4be3-8489-56c06a682e03\") " pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.962294 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/420d2432-4da0-4be3-8489-56c06a682e03-catalog-content\") pod \"certified-operators-xth4j\" (UID: \"420d2432-4da0-4be3-8489-56c06a682e03\") " pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:05:54 crc 
kubenswrapper[5008]: I0318 18:05:54.990651 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-s5pml" event={"ID":"31a94b93-89d6-4fab-87d7-05ecd80f55ec","Type":"ContainerStarted","Data":"364a2fb5676a9033b8198000b09e2283ed4fa9c7ab7bb51b891be8d620a496a8"} Mar 18 18:05:54 crc kubenswrapper[5008]: I0318 18:05:54.992160 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-s5pml" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.001512 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4nz7f" podStartSLOduration=160.001489693 podStartE2EDuration="2m40.001489693s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:54.960021829 +0000 UTC m=+211.479494908" watchObservedRunningTime="2026-03-18 18:05:55.001489693 +0000 UTC m=+211.520962762" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.013467 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-s5pml container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.013618 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-s5pml" podUID="31a94b93-89d6-4fab-87d7-05ecd80f55ec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.014132 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" 
event={"ID":"f3850a13-89c7-43a5-a58e-ad6fff6ba32f","Type":"ContainerStarted","Data":"65ee6934d917e3a33b7238a1ab9b08ecca10a37d7db8a714841ef44ccd3dcbff"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.023406 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dx69r" event={"ID":"b4612145-a0ad-4f87-b1e8-9f17248900bd","Type":"ContainerStarted","Data":"e85ffce6472425427d1544285a480fceeef078aa93244f1f2e1fde912e6204b3"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.035681 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r" event={"ID":"48236e40-cae1-46af-aa88-f5e038cc1a42","Type":"ContainerStarted","Data":"8ecc0bbe364fbd881a31c99dbfc214eb57de7b03fd2e7acb4cc464c32e89d9aa"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.039155 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b59s5"] Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.042625 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" event={"ID":"2d627459-6ea9-460c-8f9e-1fe47bcc59e1","Type":"ContainerStarted","Data":"6c55ce29e572cde7dcf8f90946ebb00d6ad35111c8cc551bfa67d69f46f3a181"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.042976 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b59s5" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.046687 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b59s5"] Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.047203 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.067071 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p7jc\" (UniqueName: \"kubernetes.io/projected/420d2432-4da0-4be3-8489-56c06a682e03-kube-api-access-5p7jc\") pod \"certified-operators-xth4j\" (UID: \"420d2432-4da0-4be3-8489-56c06a682e03\") " pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.067115 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/420d2432-4da0-4be3-8489-56c06a682e03-utilities\") pod \"certified-operators-xth4j\" (UID: \"420d2432-4da0-4be3-8489-56c06a682e03\") " pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.067136 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/420d2432-4da0-4be3-8489-56c06a682e03-catalog-content\") pod \"certified-operators-xth4j\" (UID: \"420d2432-4da0-4be3-8489-56c06a682e03\") " pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.067199 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: 
\"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.069289 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/420d2432-4da0-4be3-8489-56c06a682e03-utilities\") pod \"certified-operators-xth4j\" (UID: \"420d2432-4da0-4be3-8489-56c06a682e03\") " pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.069509 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/420d2432-4da0-4be3-8489-56c06a682e03-catalog-content\") pod \"certified-operators-xth4j\" (UID: \"420d2432-4da0-4be3-8489-56c06a682e03\") " pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:05:55 crc kubenswrapper[5008]: E0318 18:05:55.069943 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:55.569931876 +0000 UTC m=+212.089404955 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.070855 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bwf7f" podStartSLOduration=160.070833323 podStartE2EDuration="2m40.070833323s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.045765941 +0000 UTC m=+211.565239020" watchObservedRunningTime="2026-03-18 18:05:55.070833323 +0000 UTC m=+211.590306392" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.103591 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p7jc\" (UniqueName: \"kubernetes.io/projected/420d2432-4da0-4be3-8489-56c06a682e03-kube-api-access-5p7jc\") pod \"certified-operators-xth4j\" (UID: \"420d2432-4da0-4be3-8489-56c06a682e03\") " pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.149874 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv" event={"ID":"35531061-50ee-4972-bec4-75190551fbbe","Type":"ContainerStarted","Data":"747026522bdd97847f42041c519134a2becc2d2e687eed46279a177999ed1f69"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.155842 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" event={"ID":"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7","Type":"ContainerStarted","Data":"c4a1881be879c42a40aa76510b9acbe29c170e15f6e4368e590e2ad54410f9fe"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.156902 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.168808 5008 generic.go:334] "Generic (PLEG): container finished" podID="1ae5fd25-8e60-4287-8c79-260c3c82f5ae" containerID="c0163560350d763327ea22b6f8907f110ec33ecc4136ea1923f351cf9a78fdf0" exitCode=0 Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.169044 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.169330 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9209339f-be23-444c-a635-04920e6a0cf6-utilities\") pod \"community-operators-b59s5\" (UID: \"9209339f-be23-444c-a635-04920e6a0cf6\") " pod="openshift-marketplace/community-operators-b59s5" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.169374 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-947tz\" (UniqueName: \"kubernetes.io/projected/9209339f-be23-444c-a635-04920e6a0cf6-kube-api-access-947tz\") pod \"community-operators-b59s5\" (UID: \"9209339f-be23-444c-a635-04920e6a0cf6\") " pod="openshift-marketplace/community-operators-b59s5" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.169433 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9209339f-be23-444c-a635-04920e6a0cf6-catalog-content\") pod \"community-operators-b59s5\" (UID: \"9209339f-be23-444c-a635-04920e6a0cf6\") " pod="openshift-marketplace/community-operators-b59s5" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.169570 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" event={"ID":"1ae5fd25-8e60-4287-8c79-260c3c82f5ae","Type":"ContainerDied","Data":"c0163560350d763327ea22b6f8907f110ec33ecc4136ea1923f351cf9a78fdf0"} Mar 18 18:05:55 crc kubenswrapper[5008]: E0318 18:05:55.169934 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:55.669919235 +0000 UTC m=+212.189392314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.176038 5008 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-z9ssp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" start-of-body= Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.176390 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" podUID="1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.196259 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fbqw7" podStartSLOduration=160.196234755 podStartE2EDuration="2m40.196234755s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.18309525 +0000 UTC m=+211.702568339" watchObservedRunningTime="2026-03-18 18:05:55.196234755 +0000 UTC m=+211.715707824" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.196476 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-s5pml" 
podStartSLOduration=160.196468392 podStartE2EDuration="2m40.196468392s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.164168583 +0000 UTC m=+211.683641652" watchObservedRunningTime="2026-03-18 18:05:55.196468392 +0000 UTC m=+211.715941471" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.204872 5008 generic.go:334] "Generic (PLEG): container finished" podID="cb1f930b-6743-4222-900e-c2442f33be13" containerID="4a3b5b75703450ee4b6c93197a111a2256a25618397dda9542dc61c81b39bc77" exitCode=0 Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.204966 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" event={"ID":"cb1f930b-6743-4222-900e-c2442f33be13","Type":"ContainerDied","Data":"4a3b5b75703450ee4b6c93197a111a2256a25618397dda9542dc61c81b39bc77"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.204995 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" event={"ID":"cb1f930b-6743-4222-900e-c2442f33be13","Type":"ContainerStarted","Data":"2a3e200f7ee93d3e1dd5643ae2e3a14ade5abd0a3bfb4840f8d12c28b91c9910"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.217464 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz" event={"ID":"4c28c9a0-de21-4c01-a5d6-1f6490878a0b","Type":"ContainerStarted","Data":"e6746a2de91a959baf743aef5fb5bfd4b3dc07ca926aa0bc08b8474cae45e89a"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.223788 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7" 
event={"ID":"a2204982-5f64-473e-809f-17a61cf942d8","Type":"ContainerStarted","Data":"0be225324ae3977565979214bcc15a6874af7d5f3e2e34a4b53274812ce1b7bd"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.223853 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dx69r" podStartSLOduration=7.223837762 podStartE2EDuration="7.223837762s" podCreationTimestamp="2026-03-18 18:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.212592925 +0000 UTC m=+211.732066004" watchObservedRunningTime="2026-03-18 18:05:55.223837762 +0000 UTC m=+211.743310842" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.232069 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" event={"ID":"c7314b49-e434-4c49-babe-7ebc3925639a","Type":"ContainerStarted","Data":"8c80ce51085fb7d45fefeea81af83e89a625c064305162972b17ab8dfc157624"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.234019 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.237971 5008 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tfg8c container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.238045 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" podUID="c7314b49-e434-4c49-babe-7ebc3925639a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: 
connect: connection refused" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.249315 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fln9j"] Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.250273 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fln9j" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.254850 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" event={"ID":"d5bc82db-9313-414e-aa86-ff630456fb49","Type":"ContainerStarted","Data":"748b34efeb645c1743e3fe2088f5746a29ca7f6ed884257aed9e893f17be86c4"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.254897 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" event={"ID":"d5bc82db-9313-414e-aa86-ff630456fb49","Type":"ContainerStarted","Data":"1638a939c18db509f8e83a3064ddd6e98dd75e07ac74865acc82c339201f5648"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.255317 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.257052 5008 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-s5m7q container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.257088 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" podUID="d5bc82db-9313-414e-aa86-ff630456fb49" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 
10.217.0.21:8443: connect: connection refused" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.257504 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" event={"ID":"860a9876-b8f6-4125-bd1c-51518eb10283","Type":"ContainerStarted","Data":"a6d520fda5cf8a6f5d8092a6215019730978014336857c00e8726c3623486cd0"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.257530 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" event={"ID":"860a9876-b8f6-4125-bd1c-51518eb10283","Type":"ContainerStarted","Data":"290c0c01946447e2054a22a470a448e5c3c44fc4a6d379c060406fc0135062e3"} Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.273129 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9209339f-be23-444c-a635-04920e6a0cf6-catalog-content\") pod \"community-operators-b59s5\" (UID: \"9209339f-be23-444c-a635-04920e6a0cf6\") " pod="openshift-marketplace/community-operators-b59s5" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.273392 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.273416 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9209339f-be23-444c-a635-04920e6a0cf6-utilities\") pod \"community-operators-b59s5\" (UID: \"9209339f-be23-444c-a635-04920e6a0cf6\") " pod="openshift-marketplace/community-operators-b59s5" Mar 18 18:05:55 crc 
kubenswrapper[5008]: I0318 18:05:55.273432 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-947tz\" (UniqueName: \"kubernetes.io/projected/9209339f-be23-444c-a635-04920e6a0cf6-kube-api-access-947tz\") pod \"community-operators-b59s5\" (UID: \"9209339f-be23-444c-a635-04920e6a0cf6\") " pod="openshift-marketplace/community-operators-b59s5"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.274998 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9209339f-be23-444c-a635-04920e6a0cf6-catalog-content\") pod \"community-operators-b59s5\" (UID: \"9209339f-be23-444c-a635-04920e6a0cf6\") " pod="openshift-marketplace/community-operators-b59s5"
Mar 18 18:05:55 crc kubenswrapper[5008]: E0318 18:05:55.277978 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:55.777957696 +0000 UTC m=+212.297430775 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.278161 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9209339f-be23-444c-a635-04920e6a0cf6-utilities\") pod \"community-operators-b59s5\" (UID: \"9209339f-be23-444c-a635-04920e6a0cf6\") " pod="openshift-marketplace/community-operators-b59s5"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.279255 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xth4j"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.288208 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" podStartSLOduration=160.288186333 podStartE2EDuration="2m40.288186333s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.277399629 +0000 UTC m=+211.796872698" watchObservedRunningTime="2026-03-18 18:05:55.288186333 +0000 UTC m=+211.807659422"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.289839 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fln9j"]
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.290099 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt" event={"ID":"21073be2-2012-4c88-9797-92b12b7ef7db","Type":"ContainerStarted","Data":"b59c8ec434fd6c56880b548b1349a32d581ae0651d942dfd3be4fa188ca129ba"}
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.290133 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt" event={"ID":"21073be2-2012-4c88-9797-92b12b7ef7db","Type":"ContainerStarted","Data":"e81879e926f4bd796092a1f65e98a1320e267aec534bcfaec7efc4073a5581ab"}
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.304580 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" event={"ID":"137dc523-385f-4afb-b972-66093e2e071e","Type":"ContainerStarted","Data":"cb7bd5524ef44825aeb14b498c98a38bc4c800cd61aced847ae4db4b2db0df9d"}
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.311919 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-947tz\" (UniqueName: \"kubernetes.io/projected/9209339f-be23-444c-a635-04920e6a0cf6-kube-api-access-947tz\") pod \"community-operators-b59s5\" (UID: \"9209339f-be23-444c-a635-04920e6a0cf6\") " pod="openshift-marketplace/community-operators-b59s5"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.315885 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ckkdv" podStartSLOduration=160.315867013 podStartE2EDuration="2m40.315867013s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.314821822 +0000 UTC m=+211.834294901" watchObservedRunningTime="2026-03-18 18:05:55.315867013 +0000 UTC m=+211.835340092"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.316619 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-crqfs" event={"ID":"951315e5-4217-41ba-8126-073fffe96d80","Type":"ContainerStarted","Data":"5687c6b8898604c09caf5401694505bf8c8b4f1d5df26097daded3659b64915e"}
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.316647 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-crqfs" event={"ID":"951315e5-4217-41ba-8126-073fffe96d80","Type":"ContainerStarted","Data":"35c75edcd5fd96daff25896abcf9753d704ae5e19b570f519f5ad24f721864e5"}
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.374655 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.375106 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d548a7a8-a808-45d3-91be-b0e9242383ec-catalog-content\") pod \"certified-operators-fln9j\" (UID: \"d548a7a8-a808-45d3-91be-b0e9242383ec\") " pod="openshift-marketplace/certified-operators-fln9j"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.375140 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d548a7a8-a808-45d3-91be-b0e9242383ec-utilities\") pod \"certified-operators-fln9j\" (UID: \"d548a7a8-a808-45d3-91be-b0e9242383ec\") " pod="openshift-marketplace/certified-operators-fln9j"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.375245 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj8sf\" (UniqueName: \"kubernetes.io/projected/d548a7a8-a808-45d3-91be-b0e9242383ec-kube-api-access-xj8sf\") pod \"certified-operators-fln9j\" (UID: \"d548a7a8-a808-45d3-91be-b0e9242383ec\") " pod="openshift-marketplace/certified-operators-fln9j"
Mar 18 18:05:55 crc kubenswrapper[5008]: E0318 18:05:55.379517 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:55.879489141 +0000 UTC m=+212.398962220 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.398500 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" event={"ID":"9630528a-7c8c-46cd-8bf4-e116f35b6911","Type":"ContainerStarted","Data":"a07731ba3802499ec39d28b4cb0801f0d50dcaf57db91075911c9b0552e17dc2"}
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.399009 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.405525 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb" event={"ID":"2c293129-16e9-4469-8307-f11ea13cd329","Type":"ContainerStarted","Data":"9f8a5491449a8a379323e3ca95f36b9fcf30ad1f9dffe5509b870ded97657af0"}
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.410783 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sg7qz" podStartSLOduration=160.41076685 podStartE2EDuration="2m40.41076685s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.410238094 +0000 UTC m=+211.929711183" watchObservedRunningTime="2026-03-18 18:05:55.41076685 +0000 UTC m=+211.930239929"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.411359 5008 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-g9mpk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.411416 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" podUID="9630528a-7c8c-46cd-8bf4-e116f35b6911" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.448738 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" podStartSLOduration=160.448719018 podStartE2EDuration="2m40.448719018s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.43646918 +0000 UTC m=+211.955942260" watchObservedRunningTime="2026-03-18 18:05:55.448719018 +0000 UTC m=+211.968192097"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.449904 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b59s5"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.480670 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.480843 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d548a7a8-a808-45d3-91be-b0e9242383ec-catalog-content\") pod \"certified-operators-fln9j\" (UID: \"d548a7a8-a808-45d3-91be-b0e9242383ec\") " pod="openshift-marketplace/certified-operators-fln9j"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.480875 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d548a7a8-a808-45d3-91be-b0e9242383ec-utilities\") pod \"certified-operators-fln9j\" (UID: \"d548a7a8-a808-45d3-91be-b0e9242383ec\") " pod="openshift-marketplace/certified-operators-fln9j"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.480973 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj8sf\" (UniqueName: \"kubernetes.io/projected/d548a7a8-a808-45d3-91be-b0e9242383ec-kube-api-access-xj8sf\") pod \"certified-operators-fln9j\" (UID: \"d548a7a8-a808-45d3-91be-b0e9242383ec\") " pod="openshift-marketplace/certified-operators-fln9j"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.481520 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d548a7a8-a808-45d3-91be-b0e9242383ec-utilities\") pod \"certified-operators-fln9j\" (UID: \"d548a7a8-a808-45d3-91be-b0e9242383ec\") " pod="openshift-marketplace/certified-operators-fln9j"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.482034 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d548a7a8-a808-45d3-91be-b0e9242383ec-catalog-content\") pod \"certified-operators-fln9j\" (UID: \"d548a7a8-a808-45d3-91be-b0e9242383ec\") " pod="openshift-marketplace/certified-operators-fln9j"
Mar 18 18:05:55 crc kubenswrapper[5008]: E0318 18:05:55.482585 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:55.982537712 +0000 UTC m=+212.502010781 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.496322 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" event={"ID":"ebf037f0-6468-45ec-a599-49652456a53f","Type":"ContainerStarted","Data":"41df60d412b0adde13d17a701268c8e84f6804a4086216d7fe25b01b85e33f9d"}
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.496370 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d2kl6"]
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.514706 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" podStartSLOduration=160.514679346 podStartE2EDuration="2m40.514679346s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.506655206 +0000 UTC m=+212.026128285" watchObservedRunningTime="2026-03-18 18:05:55.514679346 +0000 UTC m=+212.034152425"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.552717 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d2kl6"]
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.552798 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9zlx9"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.552896 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d2kl6"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.555680 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tvxdw"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.564472 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj8sf\" (UniqueName: \"kubernetes.io/projected/d548a7a8-a808-45d3-91be-b0e9242383ec-kube-api-access-xj8sf\") pod \"certified-operators-fln9j\" (UID: \"d548a7a8-a808-45d3-91be-b0e9242383ec\") " pod="openshift-marketplace/certified-operators-fln9j"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.564549 5008 patch_prober.go:28] interesting pod/router-default-5444994796-tvxdw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 18:05:55 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld
Mar 18 18:05:55 crc kubenswrapper[5008]: [+]process-running ok
Mar 18 18:05:55 crc kubenswrapper[5008]: healthz check failed
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.564599 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tvxdw" podUID="7b10cc94-4769-4c11-a94d-7b01d8f228b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.571745 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6ksxt" podStartSLOduration=160.571731858 podStartE2EDuration="2m40.571731858s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.571669686 +0000 UTC m=+212.091142765" watchObservedRunningTime="2026-03-18 18:05:55.571731858 +0000 UTC m=+212.091204937"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.587476 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:05:55 crc kubenswrapper[5008]: E0318 18:05:55.589242 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:56.089220302 +0000 UTC m=+212.608693391 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.603235 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb" podStartSLOduration=160.603215272 podStartE2EDuration="2m40.603215272s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.603022896 +0000 UTC m=+212.122495985" watchObservedRunningTime="2026-03-18 18:05:55.603215272 +0000 UTC m=+212.122688351"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.616165 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fln9j"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.688633 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" podStartSLOduration=160.688601392 podStartE2EDuration="2m40.688601392s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.634647024 +0000 UTC m=+212.154120103" watchObservedRunningTime="2026-03-18 18:05:55.688601392 +0000 UTC m=+212.208074471"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.695754 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.695814 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-catalog-content\") pod \"community-operators-d2kl6\" (UID: \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\") " pod="openshift-marketplace/community-operators-d2kl6"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.695858 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-utilities\") pod \"community-operators-d2kl6\" (UID: \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\") " pod="openshift-marketplace/community-operators-d2kl6"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.695896 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk5n2\" (UniqueName: \"kubernetes.io/projected/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-kube-api-access-gk5n2\") pod \"community-operators-d2kl6\" (UID: \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\") " pod="openshift-marketplace/community-operators-d2kl6"
Mar 18 18:05:55 crc kubenswrapper[5008]: E0318 18:05:55.696147 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:56.196137418 +0000 UTC m=+212.715610497 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.724038 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-crqfs" podStartSLOduration=160.724019405 podStartE2EDuration="2m40.724019405s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.721139418 +0000 UTC m=+212.240612497" watchObservedRunningTime="2026-03-18 18:05:55.724019405 +0000 UTC m=+212.243492484"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.727116 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" podStartSLOduration=160.727099477 podStartE2EDuration="2m40.727099477s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.689936212 +0000 UTC m=+212.209409291" watchObservedRunningTime="2026-03-18 18:05:55.727099477 +0000 UTC m=+212.246572556"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.761964 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" podStartSLOduration=160.761947503 podStartE2EDuration="2m40.761947503s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.761091677 +0000 UTC m=+212.280564756" watchObservedRunningTime="2026-03-18 18:05:55.761947503 +0000 UTC m=+212.281420582"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.797126 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:05:55 crc kubenswrapper[5008]: E0318 18:05:55.798382 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:56.298363615 +0000 UTC m=+212.817836694 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.798438 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-catalog-content\") pod \"community-operators-d2kl6\" (UID: \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\") " pod="openshift-marketplace/community-operators-d2kl6"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.798497 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-utilities\") pod \"community-operators-d2kl6\" (UID: \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\") " pod="openshift-marketplace/community-operators-d2kl6"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.798544 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk5n2\" (UniqueName: \"kubernetes.io/projected/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-kube-api-access-gk5n2\") pod \"community-operators-d2kl6\" (UID: \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\") " pod="openshift-marketplace/community-operators-d2kl6"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.799414 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-utilities\") pod \"community-operators-d2kl6\" (UID: \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\") " pod="openshift-marketplace/community-operators-d2kl6"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.799485 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-catalog-content\") pod \"community-operators-d2kl6\" (UID: \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\") " pod="openshift-marketplace/community-operators-d2kl6"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.844933 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk5n2\" (UniqueName: \"kubernetes.io/projected/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-kube-api-access-gk5n2\") pod \"community-operators-d2kl6\" (UID: \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\") " pod="openshift-marketplace/community-operators-d2kl6"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.835852 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-8jq26" podStartSLOduration=160.835826379 podStartE2EDuration="2m40.835826379s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:55.824064276 +0000 UTC m=+212.343537355" watchObservedRunningTime="2026-03-18 18:05:55.835826379 +0000 UTC m=+212.355299458"
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.900275 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26"
Mar 18 18:05:55 crc kubenswrapper[5008]: E0318 18:05:55.901040 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:56.401028144 +0000 UTC m=+212.920501223 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:05:55 crc kubenswrapper[5008]: I0318 18:05:55.983763 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d2kl6"
Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.006574 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:05:56 crc kubenswrapper[5008]: E0318 18:05:56.006740 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:56.506693504 +0000 UTC m=+213.026166583 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.006809 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26"
Mar 18 18:05:56 crc kubenswrapper[5008]: E0318 18:05:56.007267 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:56.507259701 +0000 UTC m=+213.026732780 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.043494 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b59s5"]
Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.112088 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:05:56 crc kubenswrapper[5008]: E0318 18:05:56.112922 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:56.61290698 +0000 UTC m=+213.132380059 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.152262 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xth4j"]
Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.219352 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26"
Mar 18 18:05:56 crc kubenswrapper[5008]: E0318 18:05:56.219700 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:56.719683493 +0000 UTC m=+213.239156572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.329272 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 18:05:56 crc kubenswrapper[5008]: E0318 18:05:56.330028 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:56.830006462 +0000 UTC m=+213.349479551 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.437225 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26"
Mar 18 18:05:56 crc kubenswrapper[5008]: E0318 18:05:56.437842 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:56.937816406 +0000 UTC m=+213.457289485 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.475252 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fln9j"] Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.538606 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:56 crc kubenswrapper[5008]: E0318 18:05:56.539324 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:57.03930765 +0000 UTC m=+213.558780729 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.547609 5008 patch_prober.go:28] interesting pod/router-default-5444994796-tvxdw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 18:05:56 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Mar 18 18:05:56 crc kubenswrapper[5008]: [+]process-running ok Mar 18 18:05:56 crc kubenswrapper[5008]: healthz check failed Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.547669 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tvxdw" podUID="7b10cc94-4769-4c11-a94d-7b01d8f228b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.576948 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7" event={"ID":"a2204982-5f64-473e-809f-17a61cf942d8","Type":"ContainerStarted","Data":"41c5dc228a40778cf141f255c6e553a376ae3b42f5e2fa482cbb3d0300c78d46"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.579455 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" event={"ID":"1ae5fd25-8e60-4287-8c79-260c3c82f5ae","Type":"ContainerStarted","Data":"19808b772a0e1251d42dfa02c0919000376089c1be57c95bdd06b725cdabc6dc"} Mar 18 18:05:56 crc 
kubenswrapper[5008]: I0318 18:05:56.598805 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k965r" event={"ID":"119099cb-bd81-40db-8393-0ada3cbc7619","Type":"ContainerStarted","Data":"a9dcee9f7c792a50e701279893e77fe74db8a28bb4bf0af3cbf048d8d6b0c2e2"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.598854 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k965r" event={"ID":"119099cb-bd81-40db-8393-0ada3cbc7619","Type":"ContainerStarted","Data":"4e76b1be641171a7810aa146494e13a17d42a22d557c63e533d3f6a9fb79e5df"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.606731 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-64lb7" podStartSLOduration=161.606717242 podStartE2EDuration="2m41.606717242s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:56.604953839 +0000 UTC m=+213.124426928" watchObservedRunningTime="2026-03-18 18:05:56.606717242 +0000 UTC m=+213.126190321" Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.615213 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r" event={"ID":"48236e40-cae1-46af-aa88-f5e038cc1a42","Type":"ContainerStarted","Data":"7596635e25cd8df760ded40702fee4d5f5031690eae326f991662e09d3ffae18"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.642375 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: 
\"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:56 crc kubenswrapper[5008]: E0318 18:05:56.642894 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:57.142882277 +0000 UTC m=+213.662355356 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.650388 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b59s5" event={"ID":"9209339f-be23-444c-a635-04920e6a0cf6","Type":"ContainerStarted","Data":"ad829c9e61acd34611564eef21f78c28fdd5f6128296124ce6d56915ab89d1bb"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.666955 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-k965r" podStartSLOduration=161.666941018 podStartE2EDuration="2m41.666941018s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:56.664906487 +0000 UTC m=+213.184379566" watchObservedRunningTime="2026-03-18 18:05:56.666941018 +0000 UTC m=+213.186414087" Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.671393 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-d2kl6"] Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.691695 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5r2r" podStartSLOduration=161.69167488 podStartE2EDuration="2m41.69167488s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:56.691185406 +0000 UTC m=+213.210658485" watchObservedRunningTime="2026-03-18 18:05:56.69167488 +0000 UTC m=+213.211147959" Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.699319 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" event={"ID":"cb1f930b-6743-4222-900e-c2442f33be13","Type":"ContainerStarted","Data":"7be4208f3a887f59a87b044c9043e56af8acac4d8ecfbdfc5b71f6dea0c754b6"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.700901 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" Mar 18 18:05:56 crc kubenswrapper[5008]: W0318 18:05:56.700977 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f135d3_9f42_4d08_bfd2_71aa61f5e01f.slice/crio-8216fee027d5a797a247c9874c123cc6342a9d483894ed930845af810d844216 WatchSource:0}: Error finding container 8216fee027d5a797a247c9874c123cc6342a9d483894ed930845af810d844216: Status 404 returned error can't find the container with id 8216fee027d5a797a247c9874c123cc6342a9d483894ed930845af810d844216 Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.738787 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" podStartSLOduration=161.738771523 
podStartE2EDuration="2m41.738771523s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:56.736740962 +0000 UTC m=+213.256214041" watchObservedRunningTime="2026-03-18 18:05:56.738771523 +0000 UTC m=+213.258244602" Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.743461 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:56 crc kubenswrapper[5008]: E0318 18:05:56.746051 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:57.246024761 +0000 UTC m=+213.765497890 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.751761 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bprvc" event={"ID":"1d56a0c1-b18d-4e8c-acff-e57f000b3744","Type":"ContainerStarted","Data":"be3ac4f09fbf71b4deeea939d83e313d457a1953975d42f6d23f8a8dde8a1f73"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.769571 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" event={"ID":"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7","Type":"ContainerStarted","Data":"935c0a1b327a15c6ea32f61ea5cfa918d9fa1acdee967cf9748e17924b8bd747"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.770604 5008 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-z9ssp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" start-of-body= Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.770635 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" podUID="1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.778865 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-apiserver/apiserver-76f77b778f-bprvc" podStartSLOduration=161.778851975 podStartE2EDuration="2m41.778851975s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:56.778526555 +0000 UTC m=+213.297999624" watchObservedRunningTime="2026-03-18 18:05:56.778851975 +0000 UTC m=+213.298325054" Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.782768 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" event={"ID":"9630528a-7c8c-46cd-8bf4-e116f35b6911","Type":"ContainerStarted","Data":"32b792fa79ae03bc9f0b4d56e0e6bf5232970970f8d6819f3ffdb2787afe44a7"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.785639 5008 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-g9mpk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.785685 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" podUID="9630528a-7c8c-46cd-8bf4-e116f35b6911" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.804944 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5bfwn" event={"ID":"ade0f201-c317-40fc-bf82-293a53853ec4","Type":"ContainerStarted","Data":"bdf4470c690a0f3dd315b0e49927cc108bca71e156408283b3bc803565e1b144"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.823738 5008 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress-canary/ingress-canary-5bfwn" podStartSLOduration=8.823717261 podStartE2EDuration="8.823717261s" podCreationTimestamp="2026-03-18 18:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:56.823103123 +0000 UTC m=+213.342576202" watchObservedRunningTime="2026-03-18 18:05:56.823717261 +0000 UTC m=+213.343190340" Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.836969 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qnsxl" event={"ID":"137dc523-385f-4afb-b972-66093e2e071e","Type":"ContainerStarted","Data":"bb2f5fda48570e774b87a469c8dec397fdcff287a47978e19b2e697549384eeb"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.848280 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:56 crc kubenswrapper[5008]: E0318 18:05:56.849897 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:57.349885546 +0000 UTC m=+213.869358745 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.863990 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-crqfs" event={"ID":"951315e5-4217-41ba-8126-073fffe96d80","Type":"ContainerStarted","Data":"4a6e246ff82a0c465fc06ee2427d9e3fbba2fe92978c11d6cd3121055f96dee8"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.895300 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs" event={"ID":"e2c07d05-3ff0-41e8-b792-7b349f553049","Type":"ContainerStarted","Data":"117d54c859c981464d829ddb4df28930aba579baa09cef11360ebc3008e733ad"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.895764 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs" event={"ID":"e2c07d05-3ff0-41e8-b792-7b349f553049","Type":"ContainerStarted","Data":"6703ac49cbf03bf682e87d29cfd2e800e0f84c39279240ae950dfed9f657ff9e"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.895806 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs" Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.900542 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dx69r" 
event={"ID":"b4612145-a0ad-4f87-b1e8-9f17248900bd","Type":"ContainerStarted","Data":"d79eb46541e5cce82d85b21aac5f9974c1310c90ecd3b1afa9168ab596ec8d58"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.910406 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hmccb" event={"ID":"2c293129-16e9-4469-8307-f11ea13cd329","Type":"ContainerStarted","Data":"9a58f4eab540bb356e514dd97a56a92b0bf6fdf782361b7f2899b76c06cf233e"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.926693 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" event={"ID":"c7314b49-e434-4c49-babe-7ebc3925639a","Type":"ContainerStarted","Data":"7bf93c3c418e5c8e6845b55e4a2e6cf21dc2e2700955ddb8f903fb257bab3497"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.927528 5008 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tfg8c container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.927593 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" podUID="c7314b49-e434-4c49-babe-7ebc3925639a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.934417 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xth4j" event={"ID":"420d2432-4da0-4be3-8489-56c06a682e03","Type":"ContainerStarted","Data":"e2e936d049e1d78972d56c54c2a24e7ee96192ddf7a18ba76beb0e07359786b7"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.947457 5008 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-fln9j" event={"ID":"d548a7a8-a808-45d3-91be-b0e9242383ec","Type":"ContainerStarted","Data":"67a29946f2ec67e778e396ac154fff585ce8c10d144b941aaea87f29b903e5fb"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.954293 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:56 crc kubenswrapper[5008]: E0318 18:05:56.956004 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:57.455980468 +0000 UTC m=+213.975453547 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.967369 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ltp2s" event={"ID":"eb6e9847-508d-42d9-b429-2567547e41fe","Type":"ContainerStarted","Data":"1776a4e6615d64163a444ee94c6519a8de127c1828e2dafbb6c57a2e02aa0651"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.967426 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ltp2s" event={"ID":"eb6e9847-508d-42d9-b429-2567547e41fe","Type":"ContainerStarted","Data":"5c94466cf6dba67eec5056a13af4615ef98fd8d03dcac24c43417e3546dd5bc2"} Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.968185 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-s5pml container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.968231 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-s5pml" podUID="31a94b93-89d6-4fab-87d7-05ecd80f55ec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.981720 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs" 
podStartSLOduration=161.981684159 podStartE2EDuration="2m41.981684159s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:56.92004212 +0000 UTC m=+213.439515199" watchObservedRunningTime="2026-03-18 18:05:56.981684159 +0000 UTC m=+213.501157238" Mar 18 18:05:56 crc kubenswrapper[5008]: I0318 18:05:56.981950 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.056533 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:57 crc kubenswrapper[5008]: E0318 18:05:57.059408 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:57.55939629 +0000 UTC m=+214.078869369 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.068590 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ltp2s" podStartSLOduration=9.068566425 podStartE2EDuration="9.068566425s" podCreationTimestamp="2026-03-18 18:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:57.014130673 +0000 UTC m=+213.533603752" watchObservedRunningTime="2026-03-18 18:05:57.068566425 +0000 UTC m=+213.588039504" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.092414 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vrtz8"] Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.100756 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrtz8"] Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.100887 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.110694 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.157650 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:57 crc kubenswrapper[5008]: E0318 18:05:57.159622 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:57.659603156 +0000 UTC m=+214.179076235 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.266889 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw8rk\" (UniqueName: \"kubernetes.io/projected/5d041acc-48d2-4f2f-896f-94893b9ff41f-kube-api-access-tw8rk\") pod \"redhat-marketplace-vrtz8\" (UID: \"5d041acc-48d2-4f2f-896f-94893b9ff41f\") " pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.267480 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d041acc-48d2-4f2f-896f-94893b9ff41f-utilities\") pod \"redhat-marketplace-vrtz8\" (UID: \"5d041acc-48d2-4f2f-896f-94893b9ff41f\") " pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.267541 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d041acc-48d2-4f2f-896f-94893b9ff41f-catalog-content\") pod \"redhat-marketplace-vrtz8\" (UID: \"5d041acc-48d2-4f2f-896f-94893b9ff41f\") " pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.267599 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:57 crc kubenswrapper[5008]: E0318 18:05:57.268046 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:57.768027128 +0000 UTC m=+214.287500207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.368738 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:57 crc kubenswrapper[5008]: E0318 18:05:57.369019 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:57.868976906 +0000 UTC m=+214.388449985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.369206 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw8rk\" (UniqueName: \"kubernetes.io/projected/5d041acc-48d2-4f2f-896f-94893b9ff41f-kube-api-access-tw8rk\") pod \"redhat-marketplace-vrtz8\" (UID: \"5d041acc-48d2-4f2f-896f-94893b9ff41f\") " pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.369321 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d041acc-48d2-4f2f-896f-94893b9ff41f-utilities\") pod \"redhat-marketplace-vrtz8\" (UID: \"5d041acc-48d2-4f2f-896f-94893b9ff41f\") " pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.369440 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d041acc-48d2-4f2f-896f-94893b9ff41f-catalog-content\") pod \"redhat-marketplace-vrtz8\" (UID: \"5d041acc-48d2-4f2f-896f-94893b9ff41f\") " pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.369489 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: 
\"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.369748 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d041acc-48d2-4f2f-896f-94893b9ff41f-utilities\") pod \"redhat-marketplace-vrtz8\" (UID: \"5d041acc-48d2-4f2f-896f-94893b9ff41f\") " pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.369798 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d041acc-48d2-4f2f-896f-94893b9ff41f-catalog-content\") pod \"redhat-marketplace-vrtz8\" (UID: \"5d041acc-48d2-4f2f-896f-94893b9ff41f\") " pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:05:57 crc kubenswrapper[5008]: E0318 18:05:57.369976 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:57.869940355 +0000 UTC m=+214.389413434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.403640 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw8rk\" (UniqueName: \"kubernetes.io/projected/5d041acc-48d2-4f2f-896f-94893b9ff41f-kube-api-access-tw8rk\") pod \"redhat-marketplace-vrtz8\" (UID: \"5d041acc-48d2-4f2f-896f-94893b9ff41f\") " pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.443774 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-krh4d"] Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.444816 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.461898 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krh4d"] Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.470604 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:57 crc kubenswrapper[5008]: E0318 18:05:57.470980 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:57.970961585 +0000 UTC m=+214.490434654 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.540917 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.564832 5008 patch_prober.go:28] interesting pod/router-default-5444994796-tvxdw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 18:05:57 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Mar 18 18:05:57 crc kubenswrapper[5008]: [+]process-running ok Mar 18 18:05:57 crc kubenswrapper[5008]: healthz check failed Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.565392 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tvxdw" podUID="7b10cc94-4769-4c11-a94d-7b01d8f228b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.573749 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.573820 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lp5w\" (UniqueName: \"kubernetes.io/projected/e03ee689-ed4f-4b64-9e4a-4d6febd71716-kube-api-access-4lp5w\") pod \"redhat-marketplace-krh4d\" (UID: \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\") " pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.573862 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e03ee689-ed4f-4b64-9e4a-4d6febd71716-catalog-content\") pod \"redhat-marketplace-krh4d\" (UID: \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\") " pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.573895 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03ee689-ed4f-4b64-9e4a-4d6febd71716-utilities\") pod \"redhat-marketplace-krh4d\" (UID: \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\") " pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:05:57 crc kubenswrapper[5008]: E0318 18:05:57.574289 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:58.074274404 +0000 UTC m=+214.593747483 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.675057 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:57 crc kubenswrapper[5008]: E0318 18:05:57.675300 5008 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:58.175253453 +0000 UTC m=+214.694726532 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.675382 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03ee689-ed4f-4b64-9e4a-4d6febd71716-utilities\") pod \"redhat-marketplace-krh4d\" (UID: \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\") " pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.675811 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.675883 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lp5w\" (UniqueName: \"kubernetes.io/projected/e03ee689-ed4f-4b64-9e4a-4d6febd71716-kube-api-access-4lp5w\") pod \"redhat-marketplace-krh4d\" (UID: \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\") " pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:05:57 crc 
kubenswrapper[5008]: I0318 18:05:57.675963 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03ee689-ed4f-4b64-9e4a-4d6febd71716-catalog-content\") pod \"redhat-marketplace-krh4d\" (UID: \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\") " pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:05:57 crc kubenswrapper[5008]: E0318 18:05:57.676313 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:58.176286714 +0000 UTC m=+214.695759983 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.682976 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03ee689-ed4f-4b64-9e4a-4d6febd71716-utilities\") pod \"redhat-marketplace-krh4d\" (UID: \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\") " pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.683097 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03ee689-ed4f-4b64-9e4a-4d6febd71716-catalog-content\") pod \"redhat-marketplace-krh4d\" (UID: \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\") " pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:05:57 crc kubenswrapper[5008]: 
I0318 18:05:57.700581 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lp5w\" (UniqueName: \"kubernetes.io/projected/e03ee689-ed4f-4b64-9e4a-4d6febd71716-kube-api-access-4lp5w\") pod \"redhat-marketplace-krh4d\" (UID: \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\") " pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.700737 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-ltp2s" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.774890 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.778971 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:57 crc kubenswrapper[5008]: E0318 18:05:57.779407 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:58.279386737 +0000 UTC m=+214.798859816 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.885895 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:57 crc kubenswrapper[5008]: E0318 18:05:57.886349 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:58.386329635 +0000 UTC m=+214.905802714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.987450 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:57 crc kubenswrapper[5008]: E0318 18:05:57.987795 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:58.487756687 +0000 UTC m=+215.007229766 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.987913 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:57 crc kubenswrapper[5008]: E0318 18:05:57.988191 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:58.48818053 +0000 UTC m=+215.007653609 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.996972 5008 generic.go:334] "Generic (PLEG): container finished" podID="420d2432-4da0-4be3-8489-56c06a682e03" containerID="f691da955768cca288b0fa8f50decdd516c251746e0870a9ba18c7ff53724add" exitCode=0 Mar 18 18:05:57 crc kubenswrapper[5008]: I0318 18:05:57.997096 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xth4j" event={"ID":"420d2432-4da0-4be3-8489-56c06a682e03","Type":"ContainerDied","Data":"f691da955768cca288b0fa8f50decdd516c251746e0870a9ba18c7ff53724add"} Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.004616 5008 generic.go:334] "Generic (PLEG): container finished" podID="d548a7a8-a808-45d3-91be-b0e9242383ec" containerID="36296ca70c048458db65ba5c18c43d2e22dd69b6a62e7e52a8060cb55e66d8c2" exitCode=0 Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.005064 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fln9j" event={"ID":"d548a7a8-a808-45d3-91be-b0e9242383ec","Type":"ContainerDied","Data":"36296ca70c048458db65ba5c18c43d2e22dd69b6a62e7e52a8060cb55e66d8c2"} Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.020796 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" containerID="87b3e9832a89e9c26a806da7aec0de7854fde850dd4858d84b8caa8bfa24c518" exitCode=0 Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.020894 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-d2kl6" event={"ID":"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f","Type":"ContainerDied","Data":"87b3e9832a89e9c26a806da7aec0de7854fde850dd4858d84b8caa8bfa24c518"} Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.020923 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2kl6" event={"ID":"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f","Type":"ContainerStarted","Data":"8216fee027d5a797a247c9874c123cc6342a9d483894ed930845af810d844216"} Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.045830 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5nhgk"] Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.045877 5008 generic.go:334] "Generic (PLEG): container finished" podID="9209339f-be23-444c-a635-04920e6a0cf6" containerID="ce484199224a95f43dfbc70cf47c7daa3b81ccff4004f4b966d10d8f745b60c7" exitCode=0 Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.059755 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b59s5" event={"ID":"9209339f-be23-444c-a635-04920e6a0cf6","Type":"ContainerDied","Data":"ce484199224a95f43dfbc70cf47c7daa3b81ccff4004f4b966d10d8f745b60c7"} Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.062024 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.071367 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.091514 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5nhgk"] Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.105996 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.106509 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rdtl\" (UniqueName: \"kubernetes.io/projected/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-kube-api-access-7rdtl\") pod \"redhat-operators-5nhgk\" (UID: \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\") " pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.106536 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-utilities\") pod \"redhat-operators-5nhgk\" (UID: \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\") " pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.106636 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-catalog-content\") pod \"redhat-operators-5nhgk\" (UID: 
\"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\") " pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:05:58 crc kubenswrapper[5008]: E0318 18:05:58.106770 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:58.606750186 +0000 UTC m=+215.126223265 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.117832 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" event={"ID":"f3850a13-89c7-43a5-a58e-ad6fff6ba32f","Type":"ContainerStarted","Data":"48bfb717c61d5aef60480e9e39bb12072420ed18d93229b603684bf93ce4b612"} Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.142073 5008 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-l9dpq container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.142153 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" podUID="cb1f930b-6743-4222-900e-c2442f33be13" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 
10.217.0.20:8443: connect: connection refused" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.145369 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.209975 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rdtl\" (UniqueName: \"kubernetes.io/projected/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-kube-api-access-7rdtl\") pod \"redhat-operators-5nhgk\" (UID: \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\") " pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.210078 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-utilities\") pod \"redhat-operators-5nhgk\" (UID: \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\") " pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.210204 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.210300 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-catalog-content\") pod \"redhat-operators-5nhgk\" (UID: \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\") " pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:05:58 crc kubenswrapper[5008]: E0318 18:05:58.231976 5008 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:58.731950402 +0000 UTC m=+215.251423481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.234961 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-catalog-content\") pod \"redhat-operators-5nhgk\" (UID: \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\") " pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.237135 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-utilities\") pod \"redhat-operators-5nhgk\" (UID: \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\") " pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.285246 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rdtl\" (UniqueName: \"kubernetes.io/projected/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-kube-api-access-7rdtl\") pod \"redhat-operators-5nhgk\" (UID: \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\") " pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.309633 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.367288 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:58 crc kubenswrapper[5008]: E0318 18:05:58.367891 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:58.867873679 +0000 UTC m=+215.387346758 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.371621 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" podStartSLOduration=163.371605321 podStartE2EDuration="2m43.371605321s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:05:58.371143047 +0000 UTC m=+214.890616126" watchObservedRunningTime="2026-03-18 18:05:58.371605321 +0000 UTC m=+214.891078400" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.385786 5008 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-vrtz8"] Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.447041 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.460774 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dvkpk"] Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.462332 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.468540 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:58 crc kubenswrapper[5008]: E0318 18:05:58.468954 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:58.968941341 +0000 UTC m=+215.488414420 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.484368 5008 ???:1] "http: TLS handshake error from 192.168.126.11:33774: no serving certificate available for the kubelet" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.496923 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dvkpk"] Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.498188 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-g9mpk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.527609 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krh4d"] Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.568769 5008 patch_prober.go:28] interesting pod/router-default-5444994796-tvxdw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 18:05:58 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Mar 18 18:05:58 crc kubenswrapper[5008]: [+]process-running ok Mar 18 18:05:58 crc kubenswrapper[5008]: healthz check failed Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.568822 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tvxdw" podUID="7b10cc94-4769-4c11-a94d-7b01d8f228b1" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.569390 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.569634 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-catalog-content\") pod \"redhat-operators-dvkpk\" (UID: \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\") " pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.569694 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-utilities\") pod \"redhat-operators-dvkpk\" (UID: \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\") " pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.569767 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj4hb\" (UniqueName: \"kubernetes.io/projected/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-kube-api-access-jj4hb\") pod \"redhat-operators-dvkpk\" (UID: \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\") " pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:05:58 crc kubenswrapper[5008]: E0318 18:05:58.569853 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 18:05:59.069839007 +0000 UTC m=+215.589312086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.630094 5008 ???:1] "http: TLS handshake error from 192.168.126.11:33776: no serving certificate available for the kubelet" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.670585 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj4hb\" (UniqueName: \"kubernetes.io/projected/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-kube-api-access-jj4hb\") pod \"redhat-operators-dvkpk\" (UID: \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\") " pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.670629 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-catalog-content\") pod \"redhat-operators-dvkpk\" (UID: \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\") " pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.675228 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 
18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.675319 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-utilities\") pod \"redhat-operators-dvkpk\" (UID: \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\") " pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.675912 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-utilities\") pod \"redhat-operators-dvkpk\" (UID: \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\") " pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.676175 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-catalog-content\") pod \"redhat-operators-dvkpk\" (UID: \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\") " pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:05:58 crc kubenswrapper[5008]: E0318 18:05:58.676465 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:59.176451815 +0000 UTC m=+215.695924894 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.722315 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj4hb\" (UniqueName: \"kubernetes.io/projected/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-kube-api-access-jj4hb\") pod \"redhat-operators-dvkpk\" (UID: \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\") " pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.767943 5008 ???:1] "http: TLS handshake error from 192.168.126.11:33784: no serving certificate available for the kubelet" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.771860 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-s5m7q"] Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.776260 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z"] Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.776457 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" podUID="225c4962-d9d2-4d32-85de-51872521d9a3" containerName="route-controller-manager" containerID="cri-o://5a08372f8a3885b21bd1c9e871b024a355dce67eb58bd73e324f9677d8f43444" gracePeriod=30 Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.781936 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:58 crc kubenswrapper[5008]: E0318 18:05:58.784018 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:59.283991821 +0000 UTC m=+215.803464890 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.785537 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:58 crc kubenswrapper[5008]: E0318 18:05:58.787107 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:59.287094794 +0000 UTC m=+215.806567873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.829185 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.891075 5008 ???:1] "http: TLS handshake error from 192.168.126.11:33796: no serving certificate available for the kubelet" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.891668 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:58 crc kubenswrapper[5008]: E0318 18:05:58.892008 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:59.39199327 +0000 UTC m=+215.911466349 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.986591 5008 ???:1] "http: TLS handshake error from 192.168.126.11:33812: no serving certificate available for the kubelet" Mar 18 18:05:58 crc kubenswrapper[5008]: I0318 18:05:58.994347 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:58 crc kubenswrapper[5008]: E0318 18:05:58.994742 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:59.494729852 +0000 UTC m=+216.014202931 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.097588 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:59 crc kubenswrapper[5008]: E0318 18:05:59.098432 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:59.598417382 +0000 UTC m=+216.117890461 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.110071 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5nhgk"] Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.159187 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krh4d" event={"ID":"e03ee689-ed4f-4b64-9e4a-4d6febd71716","Type":"ContainerStarted","Data":"54db9b507be2946e47cd81a9088947508d9f7c261e6964921e8dba990e27b6fe"} Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.159237 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krh4d" event={"ID":"e03ee689-ed4f-4b64-9e4a-4d6febd71716","Type":"ContainerStarted","Data":"6b463eb8cfae6531761f5c8ae9a01acfe7205f06dcfc7d76474c7fb3772d3094"} Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.184499 5008 ???:1] "http: TLS handshake error from 192.168.126.11:33828: no serving certificate available for the kubelet" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.193485 5008 generic.go:334] "Generic (PLEG): container finished" podID="225c4962-d9d2-4d32-85de-51872521d9a3" containerID="5a08372f8a3885b21bd1c9e871b024a355dce67eb58bd73e324f9677d8f43444" exitCode=0 Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.193588 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" 
event={"ID":"225c4962-d9d2-4d32-85de-51872521d9a3","Type":"ContainerDied","Data":"5a08372f8a3885b21bd1c9e871b024a355dce67eb58bd73e324f9677d8f43444"} Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.198028 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrtz8" event={"ID":"5d041acc-48d2-4f2f-896f-94893b9ff41f","Type":"ContainerStarted","Data":"331eb53ae51d183bc66d4665a6c1f8cf06d2165248a2e1d63cd24522cc621a29"} Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.198813 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:59 crc kubenswrapper[5008]: E0318 18:05:59.199128 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:59.699117842 +0000 UTC m=+216.218590921 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.199877 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" podUID="d5bc82db-9313-414e-aa86-ff630456fb49" containerName="controller-manager" containerID="cri-o://748b34efeb645c1743e3fe2088f5746a29ca7f6ed884257aed9e893f17be86c4" gracePeriod=30 Mar 18 18:05:59 crc kubenswrapper[5008]: W0318 18:05:59.211734 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a5d0987_3c0d_4e5d_959c_65e7dbacb6e3.slice/crio-ab3a12592bf5b5f4602c21176c74e03f52f1aea33b7702cb106ff77b80b513a6 WatchSource:0}: Error finding container ab3a12592bf5b5f4602c21176c74e03f52f1aea33b7702cb106ff77b80b513a6: Status 404 returned error can't find the container with id ab3a12592bf5b5f4602c21176c74e03f52f1aea33b7702cb106ff77b80b513a6 Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.299432 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:59 crc kubenswrapper[5008]: E0318 18:05:59.300717 5008 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:05:59.800702909 +0000 UTC m=+216.320175988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.403171 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.403407 5008 ???:1] "http: TLS handshake error from 192.168.126.11:33844: no serving certificate available for the kubelet" Mar 18 18:05:59 crc kubenswrapper[5008]: E0318 18:05:59.404744 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:05:59.904715829 +0000 UTC m=+216.424188908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.487917 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l9dpq" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.511642 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:59 crc kubenswrapper[5008]: E0318 18:05:59.512021 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:06:00.012006267 +0000 UTC m=+216.531479356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.557341 5008 patch_prober.go:28] interesting pod/router-default-5444994796-tvxdw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 18:05:59 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Mar 18 18:05:59 crc kubenswrapper[5008]: [+]process-running ok Mar 18 18:05:59 crc kubenswrapper[5008]: healthz check failed Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.557429 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tvxdw" podUID="7b10cc94-4769-4c11-a94d-7b01d8f228b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.613544 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:59 crc kubenswrapper[5008]: E0318 18:05:59.614015 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 18:06:00.113997906 +0000 UTC m=+216.633470975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.625024 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.717314 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225c4962-d9d2-4d32-85de-51872521d9a3-config\") pod \"225c4962-d9d2-4d32-85de-51872521d9a3\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.718052 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.718105 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/225c4962-d9d2-4d32-85de-51872521d9a3-serving-cert\") pod \"225c4962-d9d2-4d32-85de-51872521d9a3\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.718179 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-s5stq\" (UniqueName: \"kubernetes.io/projected/225c4962-d9d2-4d32-85de-51872521d9a3-kube-api-access-s5stq\") pod \"225c4962-d9d2-4d32-85de-51872521d9a3\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.718201 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/225c4962-d9d2-4d32-85de-51872521d9a3-client-ca\") pod \"225c4962-d9d2-4d32-85de-51872521d9a3\" (UID: \"225c4962-d9d2-4d32-85de-51872521d9a3\") " Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.718508 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225c4962-d9d2-4d32-85de-51872521d9a3-config" (OuterVolumeSpecName: "config") pod "225c4962-d9d2-4d32-85de-51872521d9a3" (UID: "225c4962-d9d2-4d32-85de-51872521d9a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.718959 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225c4962-d9d2-4d32-85de-51872521d9a3-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.718974 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225c4962-d9d2-4d32-85de-51872521d9a3-client-ca" (OuterVolumeSpecName: "client-ca") pod "225c4962-d9d2-4d32-85de-51872521d9a3" (UID: "225c4962-d9d2-4d32-85de-51872521d9a3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:05:59 crc kubenswrapper[5008]: E0318 18:05:59.721780 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 18:06:00.221749278 +0000 UTC m=+216.741222357 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.728757 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/225c4962-d9d2-4d32-85de-51872521d9a3-kube-api-access-s5stq" (OuterVolumeSpecName: "kube-api-access-s5stq") pod "225c4962-d9d2-4d32-85de-51872521d9a3" (UID: "225c4962-d9d2-4d32-85de-51872521d9a3"). InnerVolumeSpecName "kube-api-access-s5stq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.733042 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/225c4962-d9d2-4d32-85de-51872521d9a3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "225c4962-d9d2-4d32-85de-51872521d9a3" (UID: "225c4962-d9d2-4d32-85de-51872521d9a3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.794502 5008 ???:1] "http: TLS handshake error from 192.168.126.11:33856: no serving certificate available for the kubelet" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.825464 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.825585 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/225c4962-d9d2-4d32-85de-51872521d9a3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.825596 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5stq\" (UniqueName: \"kubernetes.io/projected/225c4962-d9d2-4d32-85de-51872521d9a3-kube-api-access-s5stq\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.825605 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/225c4962-d9d2-4d32-85de-51872521d9a3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:05:59 crc kubenswrapper[5008]: E0318 18:05:59.825841 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:06:00.32582951 +0000 UTC m=+216.845302579 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.881306 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dvkpk"] Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.915748 5008 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.926984 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:05:59 crc kubenswrapper[5008]: E0318 18:05:59.927182 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:06:00.42715247 +0000 UTC m=+216.946625559 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.927245 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:05:59 crc kubenswrapper[5008]: E0318 18:05:59.927682 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:06:00.427672065 +0000 UTC m=+216.947145144 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:05:59 crc kubenswrapper[5008]: I0318 18:05:59.981102 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.028284 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:06:00 crc kubenswrapper[5008]: E0318 18:06:00.028750 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 18:06:00.528734796 +0000 UTC m=+217.048207865 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.129757 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm9gh\" (UniqueName: \"kubernetes.io/projected/d5bc82db-9313-414e-aa86-ff630456fb49-kube-api-access-wm9gh\") pod \"d5bc82db-9313-414e-aa86-ff630456fb49\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.129852 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5bc82db-9313-414e-aa86-ff630456fb49-serving-cert\") pod 
\"d5bc82db-9313-414e-aa86-ff630456fb49\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.129884 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-client-ca\") pod \"d5bc82db-9313-414e-aa86-ff630456fb49\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.129927 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-config\") pod \"d5bc82db-9313-414e-aa86-ff630456fb49\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.129971 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-proxy-ca-bundles\") pod \"d5bc82db-9313-414e-aa86-ff630456fb49\" (UID: \"d5bc82db-9313-414e-aa86-ff630456fb49\") " Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.130215 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:06:00 crc kubenswrapper[5008]: E0318 18:06:00.130571 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 18:06:00.63054339 +0000 UTC m=+217.150016459 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.131307 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d5bc82db-9313-414e-aa86-ff630456fb49" (UID: "d5bc82db-9313-414e-aa86-ff630456fb49"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.131383 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-client-ca" (OuterVolumeSpecName: "client-ca") pod "d5bc82db-9313-414e-aa86-ff630456fb49" (UID: "d5bc82db-9313-414e-aa86-ff630456fb49"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.134738 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-config" (OuterVolumeSpecName: "config") pod "d5bc82db-9313-414e-aa86-ff630456fb49" (UID: "d5bc82db-9313-414e-aa86-ff630456fb49"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.134856 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564286-cfxqp"] Mar 18 18:06:00 crc kubenswrapper[5008]: E0318 18:06:00.136225 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bc82db-9313-414e-aa86-ff630456fb49" containerName="controller-manager" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.136247 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bc82db-9313-414e-aa86-ff630456fb49" containerName="controller-manager" Mar 18 18:06:00 crc kubenswrapper[5008]: E0318 18:06:00.136296 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="225c4962-d9d2-4d32-85de-51872521d9a3" containerName="route-controller-manager" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.136308 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="225c4962-d9d2-4d32-85de-51872521d9a3" containerName="route-controller-manager" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.136421 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="225c4962-d9d2-4d32-85de-51872521d9a3" containerName="route-controller-manager" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.136436 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5bc82db-9313-414e-aa86-ff630456fb49" containerName="controller-manager" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.136868 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564286-cfxqp" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.137017 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5bc82db-9313-414e-aa86-ff630456fb49-kube-api-access-wm9gh" (OuterVolumeSpecName: "kube-api-access-wm9gh") pod "d5bc82db-9313-414e-aa86-ff630456fb49" (UID: "d5bc82db-9313-414e-aa86-ff630456fb49"). InnerVolumeSpecName "kube-api-access-wm9gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.137347 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5bc82db-9313-414e-aa86-ff630456fb49-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d5bc82db-9313-414e-aa86-ff630456fb49" (UID: "d5bc82db-9313-414e-aa86-ff630456fb49"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.141288 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.147422 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564286-cfxqp"] Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.231449 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:06:00 crc kubenswrapper[5008]: E0318 18:06:00.231676 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-18 18:06:00.731633893 +0000 UTC m=+217.251106972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.231723 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.231940 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cm9g\" (UniqueName: \"kubernetes.io/projected/5424805e-15fc-4424-8942-93f7095e148b-kube-api-access-2cm9g\") pod \"auto-csr-approver-29564286-cfxqp\" (UID: \"5424805e-15fc-4424-8942-93f7095e148b\") " pod="openshift-infra/auto-csr-approver-29564286-cfxqp" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.232191 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm9gh\" (UniqueName: \"kubernetes.io/projected/d5bc82db-9313-414e-aa86-ff630456fb49-kube-api-access-wm9gh\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:00 crc kubenswrapper[5008]: E0318 18:06:00.232197 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 18:06:00.732185579 +0000 UTC m=+217.251658658 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5gw26" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.232227 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5bc82db-9313-414e-aa86-ff630456fb49-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.232239 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.232248 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.232258 5008 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5bc82db-9313-414e-aa86-ff630456fb49-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.262123 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" event={"ID":"f3850a13-89c7-43a5-a58e-ad6fff6ba32f","Type":"ContainerStarted","Data":"4da32a35053a8e2187c94bffe5a83ab6f468fe237c1c5d811486fee1f25db366"} Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.267672 
5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvkpk" event={"ID":"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6","Type":"ContainerStarted","Data":"6e0c1cf05197aed9ab817cc6eee791becd2bb8bde27ab4b09de7f73f3fad6d08"} Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.273117 5008 generic.go:334] "Generic (PLEG): container finished" podID="d5bc82db-9313-414e-aa86-ff630456fb49" containerID="748b34efeb645c1743e3fe2088f5746a29ca7f6ed884257aed9e893f17be86c4" exitCode=0 Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.273188 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" event={"ID":"d5bc82db-9313-414e-aa86-ff630456fb49","Type":"ContainerDied","Data":"748b34efeb645c1743e3fe2088f5746a29ca7f6ed884257aed9e893f17be86c4"} Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.273231 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" event={"ID":"d5bc82db-9313-414e-aa86-ff630456fb49","Type":"ContainerDied","Data":"1638a939c18db509f8e83a3064ddd6e98dd75e07ac74865acc82c339201f5648"} Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.273250 5008 scope.go:117] "RemoveContainer" containerID="748b34efeb645c1743e3fe2088f5746a29ca7f6ed884257aed9e893f17be86c4" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.273387 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-s5m7q" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.283518 5008 generic.go:334] "Generic (PLEG): container finished" podID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" containerID="54db9b507be2946e47cd81a9088947508d9f7c261e6964921e8dba990e27b6fe" exitCode=0 Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.283638 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krh4d" event={"ID":"e03ee689-ed4f-4b64-9e4a-4d6febd71716","Type":"ContainerDied","Data":"54db9b507be2946e47cd81a9088947508d9f7c261e6964921e8dba990e27b6fe"} Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.295679 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-s5m7q"] Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.296220 5008 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T18:05:59.915776878Z","Handler":null,"Name":""} Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.304697 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.305336 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z" event={"ID":"225c4962-d9d2-4d32-85de-51872521d9a3","Type":"ContainerDied","Data":"d14784e67b968db0179cb4f6a40a3f0b6a70dc0140ac8993cbd6f427176d925c"} Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.311814 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-s5m7q"] Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.314054 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-556dc587d9-jflz2"] Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.315106 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.316139 5008 generic.go:334] "Generic (PLEG): container finished" podID="5d041acc-48d2-4f2f-896f-94893b9ff41f" containerID="8faaf40a7bbc945d5b2e99f0545c58c4cea99f59920c163c286bfdd46270980d" exitCode=0 Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.316197 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrtz8" event={"ID":"5d041acc-48d2-4f2f-896f-94893b9ff41f","Type":"ContainerDied","Data":"8faaf40a7bbc945d5b2e99f0545c58c4cea99f59920c163c286bfdd46270980d"} Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.327649 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.327880 5008 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.328217 5008 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.328247 5008 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.328382 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.328835 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.329192 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.329410 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.329443 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.332833 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm"] Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.350146 5008 generic.go:334] "Generic (PLEG): container finished" podID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" containerID="7be5c5242b365721ff4748c242c805063aa1ebb7d291a596d59446ec9a9f4372" exitCode=0 Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.364567 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.368023 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cm9g\" (UniqueName: \"kubernetes.io/projected/5424805e-15fc-4424-8942-93f7095e148b-kube-api-access-2cm9g\") pod \"auto-csr-approver-29564286-cfxqp\" (UID: \"5424805e-15fc-4424-8942-93f7095e148b\") " pod="openshift-infra/auto-csr-approver-29564286-cfxqp" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.382974 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.391845 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cm9g\" (UniqueName: \"kubernetes.io/projected/5424805e-15fc-4424-8942-93f7095e148b-kube-api-access-2cm9g\") pod \"auto-csr-approver-29564286-cfxqp\" (UID: \"5424805e-15fc-4424-8942-93f7095e148b\") " pod="openshift-infra/auto-csr-approver-29564286-cfxqp" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.393103 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-556dc587d9-jflz2"] Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.393160 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nhgk" event={"ID":"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3","Type":"ContainerDied","Data":"7be5c5242b365721ff4748c242c805063aa1ebb7d291a596d59446ec9a9f4372"} Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.393225 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nhgk" event={"ID":"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3","Type":"ContainerStarted","Data":"ab3a12592bf5b5f4602c21176c74e03f52f1aea33b7702cb106ff77b80b513a6"} Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.393545 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.399064 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm"] Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.399402 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.401641 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.401830 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.402012 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.402221 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.402261 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.415224 5008 scope.go:117] "RemoveContainer" containerID="748b34efeb645c1743e3fe2088f5746a29ca7f6ed884257aed9e893f17be86c4" Mar 18 18:06:00 crc kubenswrapper[5008]: E0318 18:06:00.416400 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"748b34efeb645c1743e3fe2088f5746a29ca7f6ed884257aed9e893f17be86c4\": container with ID starting with 748b34efeb645c1743e3fe2088f5746a29ca7f6ed884257aed9e893f17be86c4 not found: 
ID does not exist" containerID="748b34efeb645c1743e3fe2088f5746a29ca7f6ed884257aed9e893f17be86c4" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.416459 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"748b34efeb645c1743e3fe2088f5746a29ca7f6ed884257aed9e893f17be86c4"} err="failed to get container status \"748b34efeb645c1743e3fe2088f5746a29ca7f6ed884257aed9e893f17be86c4\": rpc error: code = NotFound desc = could not find container \"748b34efeb645c1743e3fe2088f5746a29ca7f6ed884257aed9e893f17be86c4\": container with ID starting with 748b34efeb645c1743e3fe2088f5746a29ca7f6ed884257aed9e893f17be86c4 not found: ID does not exist" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.416483 5008 scope.go:117] "RemoveContainer" containerID="5a08372f8a3885b21bd1c9e871b024a355dce67eb58bd73e324f9677d8f43444" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.431736 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z"] Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.435478 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dp77z"] Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.476136 5008 ???:1] "http: TLS handshake error from 192.168.126.11:33866: no serving certificate available for the kubelet" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.482854 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-client-ca\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.482931 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.483009 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8kj2\" (UniqueName: \"kubernetes.io/projected/69b8e1e4-924d-476d-9479-b09d357092ae-kube-api-access-x8kj2\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.483072 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-config\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.483105 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-proxy-ca-bundles\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.500206 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564286-cfxqp" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.500813 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69b8e1e4-924d-476d-9479-b09d357092ae-serving-cert\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.505214 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.505249 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.548184 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5gw26\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.550859 5008 patch_prober.go:28] interesting pod/router-default-5444994796-tvxdw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP 
probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 18:06:00 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Mar 18 18:06:00 crc kubenswrapper[5008]: [+]process-running ok Mar 18 18:06:00 crc kubenswrapper[5008]: healthz check failed Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.550932 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tvxdw" podUID="7b10cc94-4769-4c11-a94d-7b01d8f228b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.603010 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-proxy-ca-bundles\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.603101 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb5th\" (UniqueName: \"kubernetes.io/projected/a2a30953-fe71-40f7-86f5-9759485c1954-kube-api-access-mb5th\") pod \"route-controller-manager-7d6b58f489-5lfcm\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.603122 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69b8e1e4-924d-476d-9479-b09d357092ae-serving-cert\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.603146 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2a30953-fe71-40f7-86f5-9759485c1954-client-ca\") pod \"route-controller-manager-7d6b58f489-5lfcm\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.603207 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-client-ca\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.603245 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8kj2\" (UniqueName: \"kubernetes.io/projected/69b8e1e4-924d-476d-9479-b09d357092ae-kube-api-access-x8kj2\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.603261 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2a30953-fe71-40f7-86f5-9759485c1954-serving-cert\") pod \"route-controller-manager-7d6b58f489-5lfcm\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.603287 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a30953-fe71-40f7-86f5-9759485c1954-config\") pod \"route-controller-manager-7d6b58f489-5lfcm\" (UID: 
\"a2a30953-fe71-40f7-86f5-9759485c1954\") " pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.603305 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-config\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.604531 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-proxy-ca-bundles\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.604686 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-config\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.606864 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-client-ca\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.608500 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69b8e1e4-924d-476d-9479-b09d357092ae-serving-cert\") pod 
\"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.621683 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8kj2\" (UniqueName: \"kubernetes.io/projected/69b8e1e4-924d-476d-9479-b09d357092ae-kube-api-access-x8kj2\") pod \"controller-manager-556dc587d9-jflz2\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.663392 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.663436 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.670412 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.684471 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.705025 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a30953-fe71-40f7-86f5-9759485c1954-config\") pod \"route-controller-manager-7d6b58f489-5lfcm\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.705166 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb5th\" (UniqueName: \"kubernetes.io/projected/a2a30953-fe71-40f7-86f5-9759485c1954-kube-api-access-mb5th\") pod \"route-controller-manager-7d6b58f489-5lfcm\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.705209 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2a30953-fe71-40f7-86f5-9759485c1954-client-ca\") pod \"route-controller-manager-7d6b58f489-5lfcm\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.705342 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2a30953-fe71-40f7-86f5-9759485c1954-serving-cert\") pod \"route-controller-manager-7d6b58f489-5lfcm\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.708592 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a2a30953-fe71-40f7-86f5-9759485c1954-client-ca\") pod \"route-controller-manager-7d6b58f489-5lfcm\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.708713 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a30953-fe71-40f7-86f5-9759485c1954-config\") pod \"route-controller-manager-7d6b58f489-5lfcm\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.711016 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2a30953-fe71-40f7-86f5-9759485c1954-serving-cert\") pod \"route-controller-manager-7d6b58f489-5lfcm\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.721861 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb5th\" (UniqueName: \"kubernetes.io/projected/a2a30953-fe71-40f7-86f5-9759485c1954-kube-api-access-mb5th\") pod \"route-controller-manager-7d6b58f489-5lfcm\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.728528 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.827948 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:06:00 crc kubenswrapper[5008]: I0318 18:06:00.971383 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564286-cfxqp"] Mar 18 18:06:01 crc kubenswrapper[5008]: W0318 18:06:01.033528 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5424805e_15fc_4424_8942_93f7095e148b.slice/crio-b07a05049815bff95951405e533e0ac9c6108bf2240fca351740f0af62e58dd6 WatchSource:0}: Error finding container b07a05049815bff95951405e533e0ac9c6108bf2240fca351740f0af62e58dd6: Status 404 returned error can't find the container with id b07a05049815bff95951405e533e0ac9c6108bf2240fca351740f0af62e58dd6 Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.059829 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.060461 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.061650 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-s5pml container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.061692 5008 patch_prober.go:28] interesting pod/downloads-7954f5f757-s5pml container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.061696 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-s5pml" 
podUID="31a94b93-89d6-4fab-87d7-05ecd80f55ec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.061750 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-s5pml" podUID="31a94b93-89d6-4fab-87d7-05ecd80f55ec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.065133 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.065816 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.070795 5008 patch_prober.go:28] interesting pod/console-f9d7485db-gmczr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.070831 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-gmczr" podUID="0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.071934 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.073698 5008 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.092860 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.175340 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm"] Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.212154 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-556dc587d9-jflz2"] Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.212796 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8bd84e51-ab6a-4274-8245-3f13231fe267-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8bd84e51-ab6a-4274-8245-3f13231fe267\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.212893 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8bd84e51-ab6a-4274-8245-3f13231fe267-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8bd84e51-ab6a-4274-8245-3f13231fe267\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.300433 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5gw26"] Mar 18 18:06:01 crc kubenswrapper[5008]: W0318 18:06:01.308803 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69b8e1e4_924d_476d_9479_b09d357092ae.slice/crio-7b6cb8a9529db54694718d487561936322dbd03a17b9f32786be56910bd7de77 WatchSource:0}: Error finding container 
7b6cb8a9529db54694718d487561936322dbd03a17b9f32786be56910bd7de77: Status 404 returned error can't find the container with id 7b6cb8a9529db54694718d487561936322dbd03a17b9f32786be56910bd7de77 Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.314664 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8bd84e51-ab6a-4274-8245-3f13231fe267-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8bd84e51-ab6a-4274-8245-3f13231fe267\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.314710 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8bd84e51-ab6a-4274-8245-3f13231fe267-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8bd84e51-ab6a-4274-8245-3f13231fe267\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.314774 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8bd84e51-ab6a-4274-8245-3f13231fe267-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8bd84e51-ab6a-4274-8245-3f13231fe267\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.335218 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8bd84e51-ab6a-4274-8245-3f13231fe267-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8bd84e51-ab6a-4274-8245-3f13231fe267\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.380376 5008 generic.go:334] "Generic (PLEG): container finished" podID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" 
containerID="8e9604c39970d34cca722cffe1a698fa864ecfdb63481c222b935e2449f77f39" exitCode=0 Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.380674 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvkpk" event={"ID":"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6","Type":"ContainerDied","Data":"8e9604c39970d34cca722cffe1a698fa864ecfdb63481c222b935e2449f77f39"} Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.383593 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" event={"ID":"d02d52ba-4ba4-47b2-b0f3-a769e009d161","Type":"ContainerStarted","Data":"61b317bd85bab92a566e47702b9e8701e59d2dfc19c5692c2aa133d9eb047fdc"} Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.387871 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" event={"ID":"a2a30953-fe71-40f7-86f5-9759485c1954","Type":"ContainerStarted","Data":"504cf8b171aa29c9dfbc8d950b63cb25317327a06978f12ee6a36ea17cbf9cfd"} Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.397231 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564286-cfxqp" event={"ID":"5424805e-15fc-4424-8942-93f7095e148b","Type":"ContainerStarted","Data":"b07a05049815bff95951405e533e0ac9c6108bf2240fca351740f0af62e58dd6"} Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.404600 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" event={"ID":"69b8e1e4-924d-476d-9479-b09d357092ae","Type":"ContainerStarted","Data":"7b6cb8a9529db54694718d487561936322dbd03a17b9f32786be56910bd7de77"} Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.404893 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.413960 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" event={"ID":"f3850a13-89c7-43a5-a58e-ad6fff6ba32f","Type":"ContainerStarted","Data":"73d30f2d9a8d96888a34d98e1eac8883557b9ca75f41dd7e9933309cb5c35414"} Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.449748 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.449839 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.459653 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bprvc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.460961 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.547808 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.575344 5008 patch_prober.go:28] interesting pod/router-default-5444994796-tvxdw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 18:06:01 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Mar 18 18:06:01 crc kubenswrapper[5008]: [+]process-running ok Mar 18 18:06:01 crc kubenswrapper[5008]: healthz check failed Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.575393 5008 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-tvxdw" podUID="7b10cc94-4769-4c11-a94d-7b01d8f228b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.723546 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.725642 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb513699-bfbf-4ce3-8034-64633f18abf5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eb513699-bfbf-4ce3-8034-64633f18abf5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.725710 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb513699-bfbf-4ce3-8034-64633f18abf5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eb513699-bfbf-4ce3-8034-64633f18abf5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.725939 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.729450 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.729601 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.733105 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.809376 5008 ???:1] "http: TLS handshake error from 192.168.126.11:33874: no serving certificate available for the kubelet" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.829451 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb513699-bfbf-4ce3-8034-64633f18abf5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eb513699-bfbf-4ce3-8034-64633f18abf5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.829571 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb513699-bfbf-4ce3-8034-64633f18abf5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eb513699-bfbf-4ce3-8034-64633f18abf5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.829675 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb513699-bfbf-4ce3-8034-64633f18abf5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eb513699-bfbf-4ce3-8034-64633f18abf5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 
18:06:01.872952 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb513699-bfbf-4ce3-8034-64633f18abf5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eb513699-bfbf-4ce3-8034-64633f18abf5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:06:01 crc kubenswrapper[5008]: I0318 18:06:01.987752 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.082619 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.272128 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="225c4962-d9d2-4d32-85de-51872521d9a3" path="/var/lib/kubelet/pods/225c4962-d9d2-4d32-85de-51872521d9a3/volumes" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.273117 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.273807 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5bc82db-9313-414e-aa86-ff630456fb49" path="/var/lib/kubelet/pods/d5bc82db-9313-414e-aa86-ff630456fb49/volumes" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.497776 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8bd84e51-ab6a-4274-8245-3f13231fe267","Type":"ContainerStarted","Data":"b29be4ecaae32bcf746fe225a1541a9f3115bd131d8963df8729d56a02c789fe"} Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.511172 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" 
event={"ID":"69b8e1e4-924d-476d-9479-b09d357092ae","Type":"ContainerStarted","Data":"9905e140ab69d1f6cc79813a2287b4fb22489b0c258fe3e5d7b4e8b12705bf88"} Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.512248 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.540581 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" podStartSLOduration=3.54055439 podStartE2EDuration="3.54055439s" podCreationTimestamp="2026-03-18 18:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:02.538394325 +0000 UTC m=+219.057867404" watchObservedRunningTime="2026-03-18 18:06:02.54055439 +0000 UTC m=+219.060027469" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.545179 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.550347 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" event={"ID":"f3850a13-89c7-43a5-a58e-ad6fff6ba32f","Type":"ContainerStarted","Data":"e0f97f395da3aec7fe603aef19d21c720e59bc4bbe59587b2db0f8182fe14f10"} Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.556284 5008 patch_prober.go:28] interesting pod/router-default-5444994796-tvxdw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 18:06:02 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Mar 18 18:06:02 crc kubenswrapper[5008]: [+]process-running ok Mar 18 18:06:02 crc kubenswrapper[5008]: healthz check failed Mar 18 
18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.556343 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tvxdw" podUID="7b10cc94-4769-4c11-a94d-7b01d8f228b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.564348 5008 generic.go:334] "Generic (PLEG): container finished" podID="860a9876-b8f6-4125-bd1c-51518eb10283" containerID="a6d520fda5cf8a6f5d8092a6215019730978014336857c00e8726c3623486cd0" exitCode=0 Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.564466 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" event={"ID":"860a9876-b8f6-4125-bd1c-51518eb10283","Type":"ContainerDied","Data":"a6d520fda5cf8a6f5d8092a6215019730978014336857c00e8726c3623486cd0"} Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.604970 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" event={"ID":"d02d52ba-4ba4-47b2-b0f3-a769e009d161","Type":"ContainerStarted","Data":"6e66c8b3b1bd2dd3662cb9abb15f4c295129daa7311236fedc9e0a1a359b6189"} Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.605976 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.607023 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-2w6x4" podStartSLOduration=14.606997603 podStartE2EDuration="14.606997603s" podCreationTimestamp="2026-03-18 18:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:02.605956242 +0000 UTC m=+219.125429331" watchObservedRunningTime="2026-03-18 18:06:02.606997603 +0000 UTC 
m=+219.126470682" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.639769 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" event={"ID":"a2a30953-fe71-40f7-86f5-9759485c1954","Type":"ContainerStarted","Data":"83fa7069f998c98d03d98225ae3a00099c610d9e543ed21be2135d3f3c01f173"} Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.653248 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.666540 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hrvzq" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.694471 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.759834 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" podStartSLOduration=167.759811547 podStartE2EDuration="2m47.759811547s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:02.687478077 +0000 UTC m=+219.206951156" watchObservedRunningTime="2026-03-18 18:06:02.759811547 +0000 UTC m=+219.279284626" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.782319 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" podStartSLOduration=3.78229796 podStartE2EDuration="3.78229796s" podCreationTimestamp="2026-03-18 18:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:02.781992091 +0000 UTC m=+219.301465170" watchObservedRunningTime="2026-03-18 18:06:02.78229796 +0000 UTC m=+219.301771039" Mar 18 18:06:02 crc kubenswrapper[5008]: I0318 18:06:02.843829 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 18:06:03 crc kubenswrapper[5008]: I0318 18:06:03.551776 5008 patch_prober.go:28] interesting pod/router-default-5444994796-tvxdw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 18:06:03 crc kubenswrapper[5008]: [-]has-synced failed: reason withheld Mar 18 18:06:03 crc kubenswrapper[5008]: [+]process-running ok Mar 18 18:06:03 crc kubenswrapper[5008]: healthz check failed Mar 18 18:06:03 crc kubenswrapper[5008]: I0318 18:06:03.551833 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tvxdw" podUID="7b10cc94-4769-4c11-a94d-7b01d8f228b1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 18:06:03 crc kubenswrapper[5008]: I0318 18:06:03.680987 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eb513699-bfbf-4ce3-8034-64633f18abf5","Type":"ContainerStarted","Data":"7e5b855572bd1cb372e5d2a928ec7b60378f27d92d82c901ef01cd22c5cedf54"} Mar 18 18:06:03 crc kubenswrapper[5008]: I0318 18:06:03.696312 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8bd84e51-ab6a-4274-8245-3f13231fe267","Type":"ContainerStarted","Data":"ff0ec8aa6bdfc29fe5712e20596ac40aa4eb70e5a8f34328a2ef5ead21e43ecd"} Mar 18 18:06:03 crc kubenswrapper[5008]: I0318 18:06:03.722615 5008 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.722599915 podStartE2EDuration="2.722599915s" podCreationTimestamp="2026-03-18 18:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:03.71975563 +0000 UTC m=+220.239228709" watchObservedRunningTime="2026-03-18 18:06:03.722599915 +0000 UTC m=+220.242072994" Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.404483 5008 ???:1] "http: TLS handshake error from 192.168.126.11:33878: no serving certificate available for the kubelet" Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.450254 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.519325 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c8f2\" (UniqueName: \"kubernetes.io/projected/860a9876-b8f6-4125-bd1c-51518eb10283-kube-api-access-2c8f2\") pod \"860a9876-b8f6-4125-bd1c-51518eb10283\" (UID: \"860a9876-b8f6-4125-bd1c-51518eb10283\") " Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.519457 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/860a9876-b8f6-4125-bd1c-51518eb10283-config-volume\") pod \"860a9876-b8f6-4125-bd1c-51518eb10283\" (UID: \"860a9876-b8f6-4125-bd1c-51518eb10283\") " Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.519522 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/860a9876-b8f6-4125-bd1c-51518eb10283-secret-volume\") pod \"860a9876-b8f6-4125-bd1c-51518eb10283\" (UID: \"860a9876-b8f6-4125-bd1c-51518eb10283\") " Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.521443 5008 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/860a9876-b8f6-4125-bd1c-51518eb10283-config-volume" (OuterVolumeSpecName: "config-volume") pod "860a9876-b8f6-4125-bd1c-51518eb10283" (UID: "860a9876-b8f6-4125-bd1c-51518eb10283"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.548426 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860a9876-b8f6-4125-bd1c-51518eb10283-kube-api-access-2c8f2" (OuterVolumeSpecName: "kube-api-access-2c8f2") pod "860a9876-b8f6-4125-bd1c-51518eb10283" (UID: "860a9876-b8f6-4125-bd1c-51518eb10283"). InnerVolumeSpecName "kube-api-access-2c8f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.549827 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.554819 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/860a9876-b8f6-4125-bd1c-51518eb10283-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "860a9876-b8f6-4125-bd1c-51518eb10283" (UID: "860a9876-b8f6-4125-bd1c-51518eb10283"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.561226 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tvxdw" Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.624892 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c8f2\" (UniqueName: \"kubernetes.io/projected/860a9876-b8f6-4125-bd1c-51518eb10283-kube-api-access-2c8f2\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.625188 5008 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/860a9876-b8f6-4125-bd1c-51518eb10283-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.625271 5008 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/860a9876-b8f6-4125-bd1c-51518eb10283-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.733774 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.733767 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl" event={"ID":"860a9876-b8f6-4125-bd1c-51518eb10283","Type":"ContainerDied","Data":"290c0c01946447e2054a22a470a448e5c3c44fc4a6d379c060406fc0135062e3"} Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.733867 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="290c0c01946447e2054a22a470a448e5c3c44fc4a6d379c060406fc0135062e3" Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.749581 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eb513699-bfbf-4ce3-8034-64633f18abf5","Type":"ContainerStarted","Data":"051c8879c6ea391dd8f96a48315b5135dbe7ce968c186b4372928c47ac3d999e"} Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.758144 5008 generic.go:334] "Generic (PLEG): container finished" podID="8bd84e51-ab6a-4274-8245-3f13231fe267" containerID="ff0ec8aa6bdfc29fe5712e20596ac40aa4eb70e5a8f34328a2ef5ead21e43ecd" exitCode=0 Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.758996 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8bd84e51-ab6a-4274-8245-3f13231fe267","Type":"ContainerDied","Data":"ff0ec8aa6bdfc29fe5712e20596ac40aa4eb70e5a8f34328a2ef5ead21e43ecd"} Mar 18 18:06:04 crc kubenswrapper[5008]: I0318 18:06:04.775611 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.77559405 podStartE2EDuration="3.77559405s" podCreationTimestamp="2026-03-18 18:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 18:06:04.772628191 +0000 UTC m=+221.292101270" watchObservedRunningTime="2026-03-18 18:06:04.77559405 +0000 UTC m=+221.295067129" Mar 18 18:06:05 crc kubenswrapper[5008]: I0318 18:06:05.786862 5008 generic.go:334] "Generic (PLEG): container finished" podID="eb513699-bfbf-4ce3-8034-64633f18abf5" containerID="051c8879c6ea391dd8f96a48315b5135dbe7ce968c186b4372928c47ac3d999e" exitCode=0 Mar 18 18:06:05 crc kubenswrapper[5008]: I0318 18:06:05.787092 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eb513699-bfbf-4ce3-8034-64633f18abf5","Type":"ContainerDied","Data":"051c8879c6ea391dd8f96a48315b5135dbe7ce968c186b4372928c47ac3d999e"} Mar 18 18:06:06 crc kubenswrapper[5008]: I0318 18:06:06.195061 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 18:06:06 crc kubenswrapper[5008]: I0318 18:06:06.264409 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8bd84e51-ab6a-4274-8245-3f13231fe267-kubelet-dir\") pod \"8bd84e51-ab6a-4274-8245-3f13231fe267\" (UID: \"8bd84e51-ab6a-4274-8245-3f13231fe267\") " Mar 18 18:06:06 crc kubenswrapper[5008]: I0318 18:06:06.264483 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8bd84e51-ab6a-4274-8245-3f13231fe267-kube-api-access\") pod \"8bd84e51-ab6a-4274-8245-3f13231fe267\" (UID: \"8bd84e51-ab6a-4274-8245-3f13231fe267\") " Mar 18 18:06:06 crc kubenswrapper[5008]: I0318 18:06:06.265829 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bd84e51-ab6a-4274-8245-3f13231fe267-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8bd84e51-ab6a-4274-8245-3f13231fe267" (UID: "8bd84e51-ab6a-4274-8245-3f13231fe267"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:06:06 crc kubenswrapper[5008]: I0318 18:06:06.283626 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd84e51-ab6a-4274-8245-3f13231fe267-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8bd84e51-ab6a-4274-8245-3f13231fe267" (UID: "8bd84e51-ab6a-4274-8245-3f13231fe267"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:06 crc kubenswrapper[5008]: I0318 18:06:06.366695 5008 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8bd84e51-ab6a-4274-8245-3f13231fe267-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:06 crc kubenswrapper[5008]: I0318 18:06:06.366760 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8bd84e51-ab6a-4274-8245-3f13231fe267-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:06 crc kubenswrapper[5008]: I0318 18:06:06.699106 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ltp2s" Mar 18 18:06:06 crc kubenswrapper[5008]: I0318 18:06:06.807452 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 18:06:06 crc kubenswrapper[5008]: I0318 18:06:06.807645 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8bd84e51-ab6a-4274-8245-3f13231fe267","Type":"ContainerDied","Data":"b29be4ecaae32bcf746fe225a1541a9f3115bd131d8963df8729d56a02c789fe"} Mar 18 18:06:06 crc kubenswrapper[5008]: I0318 18:06:06.807906 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b29be4ecaae32bcf746fe225a1541a9f3115bd131d8963df8729d56a02c789fe" Mar 18 18:06:07 crc kubenswrapper[5008]: I0318 18:06:07.189831 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:06:07 crc kubenswrapper[5008]: I0318 18:06:07.288078 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb513699-bfbf-4ce3-8034-64633f18abf5-kubelet-dir\") pod \"eb513699-bfbf-4ce3-8034-64633f18abf5\" (UID: \"eb513699-bfbf-4ce3-8034-64633f18abf5\") " Mar 18 18:06:07 crc kubenswrapper[5008]: I0318 18:06:07.288152 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb513699-bfbf-4ce3-8034-64633f18abf5-kube-api-access\") pod \"eb513699-bfbf-4ce3-8034-64633f18abf5\" (UID: \"eb513699-bfbf-4ce3-8034-64633f18abf5\") " Mar 18 18:06:07 crc kubenswrapper[5008]: I0318 18:06:07.288258 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb513699-bfbf-4ce3-8034-64633f18abf5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "eb513699-bfbf-4ce3-8034-64633f18abf5" (UID: "eb513699-bfbf-4ce3-8034-64633f18abf5"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:06:07 crc kubenswrapper[5008]: I0318 18:06:07.288479 5008 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb513699-bfbf-4ce3-8034-64633f18abf5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:07 crc kubenswrapper[5008]: I0318 18:06:07.304836 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb513699-bfbf-4ce3-8034-64633f18abf5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "eb513699-bfbf-4ce3-8034-64633f18abf5" (UID: "eb513699-bfbf-4ce3-8034-64633f18abf5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:07 crc kubenswrapper[5008]: I0318 18:06:07.389183 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb513699-bfbf-4ce3-8034-64633f18abf5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:07 crc kubenswrapper[5008]: I0318 18:06:07.821270 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eb513699-bfbf-4ce3-8034-64633f18abf5","Type":"ContainerDied","Data":"7e5b855572bd1cb372e5d2a928ec7b60378f27d92d82c901ef01cd22c5cedf54"} Mar 18 18:06:07 crc kubenswrapper[5008]: I0318 18:06:07.821308 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e5b855572bd1cb372e5d2a928ec7b60378f27d92d82c901ef01cd22c5cedf54" Mar 18 18:06:07 crc kubenswrapper[5008]: I0318 18:06:07.821368 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 18:06:07 crc kubenswrapper[5008]: I0318 18:06:07.884573 5008 ???:1] "http: TLS handshake error from 192.168.126.11:46904: no serving certificate available for the kubelet" Mar 18 18:06:09 crc kubenswrapper[5008]: I0318 18:06:09.550219 5008 ???:1] "http: TLS handshake error from 192.168.126.11:46914: no serving certificate available for the kubelet" Mar 18 18:06:11 crc kubenswrapper[5008]: I0318 18:06:11.064067 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-s5pml" Mar 18 18:06:11 crc kubenswrapper[5008]: I0318 18:06:11.069874 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:06:11 crc kubenswrapper[5008]: I0318 18:06:11.074122 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:06:15 crc kubenswrapper[5008]: I0318 18:06:15.311178 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs\") pod \"network-metrics-daemon-g2z9p\" (UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:06:15 crc kubenswrapper[5008]: I0318 18:06:15.316675 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7-metrics-certs\") pod \"network-metrics-daemon-g2z9p\" (UID: \"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7\") " pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:06:15 crc kubenswrapper[5008]: I0318 18:06:15.334506 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g2z9p" Mar 18 18:06:17 crc kubenswrapper[5008]: I0318 18:06:17.946038 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-556dc587d9-jflz2"] Mar 18 18:06:17 crc kubenswrapper[5008]: I0318 18:06:17.946472 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" podUID="69b8e1e4-924d-476d-9479-b09d357092ae" containerName="controller-manager" containerID="cri-o://9905e140ab69d1f6cc79813a2287b4fb22489b0c258fe3e5d7b4e8b12705bf88" gracePeriod=30 Mar 18 18:06:17 crc kubenswrapper[5008]: I0318 18:06:17.951958 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm"] Mar 18 18:06:17 crc kubenswrapper[5008]: I0318 18:06:17.952422 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" podUID="a2a30953-fe71-40f7-86f5-9759485c1954" containerName="route-controller-manager" containerID="cri-o://83fa7069f998c98d03d98225ae3a00099c610d9e543ed21be2135d3f3c01f173" gracePeriod=30 Mar 18 18:06:18 crc kubenswrapper[5008]: I0318 18:06:18.916994 5008 generic.go:334] "Generic (PLEG): container finished" podID="a2a30953-fe71-40f7-86f5-9759485c1954" containerID="83fa7069f998c98d03d98225ae3a00099c610d9e543ed21be2135d3f3c01f173" exitCode=0 Mar 18 18:06:18 crc kubenswrapper[5008]: I0318 18:06:18.917117 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" event={"ID":"a2a30953-fe71-40f7-86f5-9759485c1954","Type":"ContainerDied","Data":"83fa7069f998c98d03d98225ae3a00099c610d9e543ed21be2135d3f3c01f173"} Mar 18 18:06:18 crc kubenswrapper[5008]: I0318 18:06:18.918993 5008 generic.go:334] "Generic (PLEG): container finished" 
podID="69b8e1e4-924d-476d-9479-b09d357092ae" containerID="9905e140ab69d1f6cc79813a2287b4fb22489b0c258fe3e5d7b4e8b12705bf88" exitCode=0 Mar 18 18:06:18 crc kubenswrapper[5008]: I0318 18:06:18.919182 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" event={"ID":"69b8e1e4-924d-476d-9479-b09d357092ae","Type":"ContainerDied","Data":"9905e140ab69d1f6cc79813a2287b4fb22489b0c258fe3e5d7b4e8b12705bf88"} Mar 18 18:06:20 crc kubenswrapper[5008]: I0318 18:06:20.685231 5008 patch_prober.go:28] interesting pod/controller-manager-556dc587d9-jflz2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" start-of-body= Mar 18 18:06:20 crc kubenswrapper[5008]: I0318 18:06:20.685304 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" podUID="69b8e1e4-924d-476d-9479-b09d357092ae" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" Mar 18 18:06:20 crc kubenswrapper[5008]: I0318 18:06:20.731486 5008 patch_prober.go:28] interesting pod/route-controller-manager-7d6b58f489-5lfcm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Mar 18 18:06:20 crc kubenswrapper[5008]: I0318 18:06:20.731573 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" podUID="a2a30953-fe71-40f7-86f5-9759485c1954" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 
10.217.0.55:8443: connect: connection refused" Mar 18 18:06:21 crc kubenswrapper[5008]: I0318 18:06:21.071219 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:06:24 crc kubenswrapper[5008]: I0318 18:06:24.460932 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:06:24 crc kubenswrapper[5008]: I0318 18:06:24.461211 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:06:27 crc kubenswrapper[5008]: E0318 18:06:27.500431 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 18 18:06:27 crc kubenswrapper[5008]: E0318 18:06:27.500910 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:06:27 crc kubenswrapper[5008]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 18:06:27 crc kubenswrapper[5008]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pkgqk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29564284-6tsrt_openshift-infra(51d04574-0631-403a-8bf5-4127787463d7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 18 18:06:27 crc kubenswrapper[5008]: > logger="UnhandledError" Mar 18 18:06:27 crc kubenswrapper[5008]: E0318 18:06:27.502039 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29564284-6tsrt" podUID="51d04574-0631-403a-8bf5-4127787463d7" Mar 18 18:06:27 crc kubenswrapper[5008]: E0318 18:06:27.520151 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 18 18:06:27 crc kubenswrapper[5008]: E0318 18:06:27.520414 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:06:27 crc kubenswrapper[5008]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if 
not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 18:06:27 crc kubenswrapper[5008]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2cm9g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29564286-cfxqp_openshift-infra(5424805e-15fc-4424-8942-93f7095e148b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 18 18:06:27 crc kubenswrapper[5008]: > logger="UnhandledError" Mar 18 18:06:27 crc kubenswrapper[5008]: E0318 18:06:27.521707 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29564286-cfxqp" podUID="5424805e-15fc-4424-8942-93f7095e148b" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.539958 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.546376 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.570597 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2"] Mar 18 18:06:27 crc kubenswrapper[5008]: E0318 18:06:27.570872 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb513699-bfbf-4ce3-8034-64633f18abf5" containerName="pruner" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.570893 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb513699-bfbf-4ce3-8034-64633f18abf5" containerName="pruner" Mar 18 18:06:27 crc kubenswrapper[5008]: E0318 18:06:27.570902 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd84e51-ab6a-4274-8245-3f13231fe267" containerName="pruner" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.570908 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd84e51-ab6a-4274-8245-3f13231fe267" containerName="pruner" Mar 18 18:06:27 crc kubenswrapper[5008]: E0318 18:06:27.570915 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2a30953-fe71-40f7-86f5-9759485c1954" containerName="route-controller-manager" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.570921 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2a30953-fe71-40f7-86f5-9759485c1954" containerName="route-controller-manager" Mar 18 18:06:27 crc kubenswrapper[5008]: E0318 18:06:27.570936 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69b8e1e4-924d-476d-9479-b09d357092ae" containerName="controller-manager" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.570942 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="69b8e1e4-924d-476d-9479-b09d357092ae" containerName="controller-manager" Mar 18 18:06:27 crc kubenswrapper[5008]: E0318 18:06:27.570956 5008 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="860a9876-b8f6-4125-bd1c-51518eb10283" containerName="collect-profiles" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.570962 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="860a9876-b8f6-4125-bd1c-51518eb10283" containerName="collect-profiles" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.571086 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="860a9876-b8f6-4125-bd1c-51518eb10283" containerName="collect-profiles" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.571096 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="69b8e1e4-924d-476d-9479-b09d357092ae" containerName="controller-manager" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.571103 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2a30953-fe71-40f7-86f5-9759485c1954" containerName="route-controller-manager" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.571113 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb513699-bfbf-4ce3-8034-64633f18abf5" containerName="pruner" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.571122 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bd84e51-ab6a-4274-8245-3f13231fe267" containerName="pruner" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.571521 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.606081 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2"] Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.688011 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a30953-fe71-40f7-86f5-9759485c1954-config\") pod \"a2a30953-fe71-40f7-86f5-9759485c1954\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.688819 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2a30953-fe71-40f7-86f5-9759485c1954-config" (OuterVolumeSpecName: "config") pod "a2a30953-fe71-40f7-86f5-9759485c1954" (UID: "a2a30953-fe71-40f7-86f5-9759485c1954"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.688989 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb5th\" (UniqueName: \"kubernetes.io/projected/a2a30953-fe71-40f7-86f5-9759485c1954-kube-api-access-mb5th\") pod \"a2a30953-fe71-40f7-86f5-9759485c1954\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690087 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-client-ca\") pod \"69b8e1e4-924d-476d-9479-b09d357092ae\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690168 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2a30953-fe71-40f7-86f5-9759485c1954-serving-cert\") pod \"a2a30953-fe71-40f7-86f5-9759485c1954\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690206 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-config\") pod \"69b8e1e4-924d-476d-9479-b09d357092ae\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690232 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-proxy-ca-bundles\") pod \"69b8e1e4-924d-476d-9479-b09d357092ae\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690256 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8kj2\" (UniqueName: 
\"kubernetes.io/projected/69b8e1e4-924d-476d-9479-b09d357092ae-kube-api-access-x8kj2\") pod \"69b8e1e4-924d-476d-9479-b09d357092ae\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690285 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69b8e1e4-924d-476d-9479-b09d357092ae-serving-cert\") pod \"69b8e1e4-924d-476d-9479-b09d357092ae\" (UID: \"69b8e1e4-924d-476d-9479-b09d357092ae\") " Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690276 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-client-ca" (OuterVolumeSpecName: "client-ca") pod "69b8e1e4-924d-476d-9479-b09d357092ae" (UID: "69b8e1e4-924d-476d-9479-b09d357092ae"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690309 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2a30953-fe71-40f7-86f5-9759485c1954-client-ca\") pod \"a2a30953-fe71-40f7-86f5-9759485c1954\" (UID: \"a2a30953-fe71-40f7-86f5-9759485c1954\") " Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690441 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb0a321a-69c5-47bf-89f3-539a6099a650-config\") pod \"route-controller-manager-68854d77bf-62mq2\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690464 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fb0a321a-69c5-47bf-89f3-539a6099a650-client-ca\") pod \"route-controller-manager-68854d77bf-62mq2\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690538 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb0a321a-69c5-47bf-89f3-539a6099a650-serving-cert\") pod \"route-controller-manager-68854d77bf-62mq2\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690572 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pnjp\" (UniqueName: \"kubernetes.io/projected/fb0a321a-69c5-47bf-89f3-539a6099a650-kube-api-access-6pnjp\") pod \"route-controller-manager-68854d77bf-62mq2\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690648 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2a30953-fe71-40f7-86f5-9759485c1954-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690661 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690928 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-config" (OuterVolumeSpecName: "config") pod "69b8e1e4-924d-476d-9479-b09d357092ae" (UID: 
"69b8e1e4-924d-476d-9479-b09d357092ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.690949 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "69b8e1e4-924d-476d-9479-b09d357092ae" (UID: "69b8e1e4-924d-476d-9479-b09d357092ae"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.691843 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2a30953-fe71-40f7-86f5-9759485c1954-client-ca" (OuterVolumeSpecName: "client-ca") pod "a2a30953-fe71-40f7-86f5-9759485c1954" (UID: "a2a30953-fe71-40f7-86f5-9759485c1954"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.695682 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a30953-fe71-40f7-86f5-9759485c1954-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a2a30953-fe71-40f7-86f5-9759485c1954" (UID: "a2a30953-fe71-40f7-86f5-9759485c1954"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.695892 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2a30953-fe71-40f7-86f5-9759485c1954-kube-api-access-mb5th" (OuterVolumeSpecName: "kube-api-access-mb5th") pod "a2a30953-fe71-40f7-86f5-9759485c1954" (UID: "a2a30953-fe71-40f7-86f5-9759485c1954"). InnerVolumeSpecName "kube-api-access-mb5th". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.695963 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69b8e1e4-924d-476d-9479-b09d357092ae-kube-api-access-x8kj2" (OuterVolumeSpecName: "kube-api-access-x8kj2") pod "69b8e1e4-924d-476d-9479-b09d357092ae" (UID: "69b8e1e4-924d-476d-9479-b09d357092ae"). InnerVolumeSpecName "kube-api-access-x8kj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.698353 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b8e1e4-924d-476d-9479-b09d357092ae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "69b8e1e4-924d-476d-9479-b09d357092ae" (UID: "69b8e1e4-924d-476d-9479-b09d357092ae"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.792186 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb0a321a-69c5-47bf-89f3-539a6099a650-serving-cert\") pod \"route-controller-manager-68854d77bf-62mq2\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.792230 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pnjp\" (UniqueName: \"kubernetes.io/projected/fb0a321a-69c5-47bf-89f3-539a6099a650-kube-api-access-6pnjp\") pod \"route-controller-manager-68854d77bf-62mq2\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.792287 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/fb0a321a-69c5-47bf-89f3-539a6099a650-config\") pod \"route-controller-manager-68854d77bf-62mq2\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.792303 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb0a321a-69c5-47bf-89f3-539a6099a650-client-ca\") pod \"route-controller-manager-68854d77bf-62mq2\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.792344 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb5th\" (UniqueName: \"kubernetes.io/projected/a2a30953-fe71-40f7-86f5-9759485c1954-kube-api-access-mb5th\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.792356 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2a30953-fe71-40f7-86f5-9759485c1954-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.792365 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.792376 5008 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69b8e1e4-924d-476d-9479-b09d357092ae-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.792385 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8kj2\" (UniqueName: 
\"kubernetes.io/projected/69b8e1e4-924d-476d-9479-b09d357092ae-kube-api-access-x8kj2\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.792393 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69b8e1e4-924d-476d-9479-b09d357092ae-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.792401 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2a30953-fe71-40f7-86f5-9759485c1954-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.793677 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb0a321a-69c5-47bf-89f3-539a6099a650-client-ca\") pod \"route-controller-manager-68854d77bf-62mq2\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.793826 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb0a321a-69c5-47bf-89f3-539a6099a650-config\") pod \"route-controller-manager-68854d77bf-62mq2\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.796899 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb0a321a-69c5-47bf-89f3-539a6099a650-serving-cert\") pod \"route-controller-manager-68854d77bf-62mq2\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.808429 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6pnjp\" (UniqueName: \"kubernetes.io/projected/fb0a321a-69c5-47bf-89f3-539a6099a650-kube-api-access-6pnjp\") pod \"route-controller-manager-68854d77bf-62mq2\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.900290 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.997842 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" event={"ID":"69b8e1e4-924d-476d-9479-b09d357092ae","Type":"ContainerDied","Data":"7b6cb8a9529db54694718d487561936322dbd03a17b9f32786be56910bd7de77"} Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.997893 5008 scope.go:117] "RemoveContainer" containerID="9905e140ab69d1f6cc79813a2287b4fb22489b0c258fe3e5d7b4e8b12705bf88" Mar 18 18:06:27 crc kubenswrapper[5008]: I0318 18:06:27.997925 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-556dc587d9-jflz2" Mar 18 18:06:28 crc kubenswrapper[5008]: I0318 18:06:28.007374 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" event={"ID":"a2a30953-fe71-40f7-86f5-9759485c1954","Type":"ContainerDied","Data":"504cf8b171aa29c9dfbc8d950b63cb25317327a06978f12ee6a36ea17cbf9cfd"} Mar 18 18:06:28 crc kubenswrapper[5008]: I0318 18:06:28.007621 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm" Mar 18 18:06:28 crc kubenswrapper[5008]: E0318 18:06:28.009719 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29564284-6tsrt" podUID="51d04574-0631-403a-8bf5-4127787463d7" Mar 18 18:06:28 crc kubenswrapper[5008]: E0318 18:06:28.009795 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29564286-cfxqp" podUID="5424805e-15fc-4424-8942-93f7095e148b" Mar 18 18:06:28 crc kubenswrapper[5008]: I0318 18:06:28.054803 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-556dc587d9-jflz2"] Mar 18 18:06:28 crc kubenswrapper[5008]: I0318 18:06:28.056692 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-556dc587d9-jflz2"] Mar 18 18:06:28 crc kubenswrapper[5008]: I0318 18:06:28.061863 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm"] Mar 18 18:06:28 crc kubenswrapper[5008]: I0318 18:06:28.065618 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d6b58f489-5lfcm"] Mar 18 18:06:28 crc kubenswrapper[5008]: I0318 18:06:28.207273 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69b8e1e4-924d-476d-9479-b09d357092ae" path="/var/lib/kubelet/pods/69b8e1e4-924d-476d-9479-b09d357092ae/volumes" Mar 18 18:06:28 crc kubenswrapper[5008]: I0318 18:06:28.208078 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="a2a30953-fe71-40f7-86f5-9759485c1954" path="/var/lib/kubelet/pods/a2a30953-fe71-40f7-86f5-9759485c1954/volumes" Mar 18 18:06:29 crc kubenswrapper[5008]: E0318 18:06:29.081361 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 18:06:29 crc kubenswrapper[5008]: E0318 18:06:29.081896 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gk5n2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevi
ces:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-d2kl6_openshift-marketplace(f4f135d3-9f42-4d08-bfd2-71aa61f5e01f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 18:06:29 crc kubenswrapper[5008]: E0318 18:06:29.083125 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-d2kl6" podUID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.053298 5008 ???:1] "http: TLS handshake error from 192.168.126.11:33050: no serving certificate available for the kubelet" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.335604 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59c9df7dd-6bgc7"] Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.336998 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.339410 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.339514 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.339698 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.339920 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.340076 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.340395 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59c9df7dd-6bgc7"] Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.341308 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.345226 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.426441 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-proxy-ca-bundles\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " 
pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.426480 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-config\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.426509 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-client-ca\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.426633 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm76d\" (UniqueName: \"kubernetes.io/projected/62c2806d-3070-48d3-9cf8-7efc27c73c53-kube-api-access-tm76d\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.426685 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62c2806d-3070-48d3-9cf8-7efc27c73c53-serving-cert\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.528234 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm76d\" (UniqueName: 
\"kubernetes.io/projected/62c2806d-3070-48d3-9cf8-7efc27c73c53-kube-api-access-tm76d\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.528297 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62c2806d-3070-48d3-9cf8-7efc27c73c53-serving-cert\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.528353 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-proxy-ca-bundles\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.529177 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-config\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.529208 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-client-ca\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.529778 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-proxy-ca-bundles\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.530140 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-client-ca\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.530747 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-config\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.534442 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62c2806d-3070-48d3-9cf8-7efc27c73c53-serving-cert\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 18:06:30.542958 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm76d\" (UniqueName: \"kubernetes.io/projected/62c2806d-3070-48d3-9cf8-7efc27c73c53-kube-api-access-tm76d\") pod \"controller-manager-59c9df7dd-6bgc7\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:30 crc kubenswrapper[5008]: I0318 
18:06:30.676308 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:31 crc kubenswrapper[5008]: I0318 18:06:31.729473 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9fjbs" Mar 18 18:06:32 crc kubenswrapper[5008]: E0318 18:06:32.613794 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-d2kl6" podUID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" Mar 18 18:06:32 crc kubenswrapper[5008]: E0318 18:06:32.726798 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 18:06:32 crc kubenswrapper[5008]: E0318 18:06:32.726964 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jj4hb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-dvkpk_openshift-marketplace(8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 18:06:32 crc kubenswrapper[5008]: E0318 18:06:32.728364 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-dvkpk" podUID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" Mar 18 18:06:34 crc 
kubenswrapper[5008]: E0318 18:06:34.115059 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-dvkpk" podUID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" Mar 18 18:06:34 crc kubenswrapper[5008]: E0318 18:06:34.177638 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 18:06:34 crc kubenswrapper[5008]: E0318 18:06:34.177837 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xj8sf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-fln9j_openshift-marketplace(d548a7a8-a808-45d3-91be-b0e9242383ec): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 18:06:34 crc kubenswrapper[5008]: E0318 18:06:34.179071 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-fln9j" podUID="d548a7a8-a808-45d3-91be-b0e9242383ec" Mar 18 18:06:35 crc 
kubenswrapper[5008]: E0318 18:06:35.648026 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-fln9j" podUID="d548a7a8-a808-45d3-91be-b0e9242383ec" Mar 18 18:06:35 crc kubenswrapper[5008]: I0318 18:06:35.686991 5008 scope.go:117] "RemoveContainer" containerID="83fa7069f998c98d03d98225ae3a00099c610d9e543ed21be2135d3f3c01f173" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.761031 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.761195 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tw8rk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vrtz8_openshift-marketplace(5d041acc-48d2-4f2f-896f-94893b9ff41f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.761441 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.761618 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-947tz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-b59s5_openshift-marketplace(9209339f-be23-444c-a635-04920e6a0cf6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.762949 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-b59s5" podUID="9209339f-be23-444c-a635-04920e6a0cf6" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.762963 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vrtz8" podUID="5d041acc-48d2-4f2f-896f-94893b9ff41f" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.789314 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.789763 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4lp5w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-krh4d_openshift-marketplace(e03ee689-ed4f-4b64-9e4a-4d6febd71716): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.791314 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-krh4d" podUID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" Mar 18 18:06:35 crc 
kubenswrapper[5008]: E0318 18:06:35.793538 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.793727 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7rdtl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-5nhgk_openshift-marketplace(5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.794967 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5nhgk" podUID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.865222 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.865394 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5p7jc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-xth4j_openshift-marketplace(420d2432-4da0-4be3-8489-56c06a682e03): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 18:06:35 crc kubenswrapper[5008]: E0318 18:06:35.866584 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-xth4j" podUID="420d2432-4da0-4be3-8489-56c06a682e03" Mar 18 18:06:35 crc 
kubenswrapper[5008]: I0318 18:06:35.933392 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59c9df7dd-6bgc7"] Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.049681 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" event={"ID":"62c2806d-3070-48d3-9cf8-7efc27c73c53","Type":"ContainerStarted","Data":"f8a122136f951a1c64a72af664ee6a599a5610a1c43f9267c598275464ad3ee1"} Mar 18 18:06:36 crc kubenswrapper[5008]: E0318 18:06:36.051280 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5nhgk" podUID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" Mar 18 18:06:36 crc kubenswrapper[5008]: E0318 18:06:36.051388 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-xth4j" podUID="420d2432-4da0-4be3-8489-56c06a682e03" Mar 18 18:06:36 crc kubenswrapper[5008]: E0318 18:06:36.051527 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-krh4d" podUID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" Mar 18 18:06:36 crc kubenswrapper[5008]: E0318 18:06:36.052270 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/community-operators-b59s5" podUID="9209339f-be23-444c-a635-04920e6a0cf6" Mar 18 18:06:36 crc kubenswrapper[5008]: E0318 18:06:36.052387 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vrtz8" podUID="5d041acc-48d2-4f2f-896f-94893b9ff41f" Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.224158 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g2z9p"] Mar 18 18:06:36 crc kubenswrapper[5008]: W0318 18:06:36.230218 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ae9a1f3_c9f8_4b4a_9d4f_0f3fb900aab7.slice/crio-823b0b0f9c213d7d1a0761d6c502da4c3a7a6a923fb6d17677a8ddddff5837fb WatchSource:0}: Error finding container 823b0b0f9c213d7d1a0761d6c502da4c3a7a6a923fb6d17677a8ddddff5837fb: Status 404 returned error can't find the container with id 823b0b0f9c213d7d1a0761d6c502da4c3a7a6a923fb6d17677a8ddddff5837fb Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.245840 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2"] Mar 18 18:06:36 crc kubenswrapper[5008]: W0318 18:06:36.255779 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb0a321a_69c5_47bf_89f3_539a6099a650.slice/crio-8f83879667aaf25c94f94000e3e18e28960bd0fb4ad3d830422b39b54ecff3dc WatchSource:0}: Error finding container 8f83879667aaf25c94f94000e3e18e28960bd0fb4ad3d830422b39b54ecff3dc: Status 404 returned error can't find the container with id 8f83879667aaf25c94f94000e3e18e28960bd0fb4ad3d830422b39b54ecff3dc Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 
18:06:36.293689 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.294598 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.299922 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.301024 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.301302 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.402773 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33d0dd4b-ff46-4825-b509-b4dd89e78918-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"33d0dd4b-ff46-4825-b509-b4dd89e78918\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.402905 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33d0dd4b-ff46-4825-b509-b4dd89e78918-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"33d0dd4b-ff46-4825-b509-b4dd89e78918\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.504328 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33d0dd4b-ff46-4825-b509-b4dd89e78918-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"33d0dd4b-ff46-4825-b509-b4dd89e78918\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.504508 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33d0dd4b-ff46-4825-b509-b4dd89e78918-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"33d0dd4b-ff46-4825-b509-b4dd89e78918\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.504661 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33d0dd4b-ff46-4825-b509-b4dd89e78918-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"33d0dd4b-ff46-4825-b509-b4dd89e78918\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.530763 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33d0dd4b-ff46-4825-b509-b4dd89e78918-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"33d0dd4b-ff46-4825-b509-b4dd89e78918\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:06:36 crc kubenswrapper[5008]: I0318 18:06:36.614860 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.057084 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" event={"ID":"62c2806d-3070-48d3-9cf8-7efc27c73c53","Type":"ContainerStarted","Data":"8be6759b34c0c2e27b99facfd5bcd9de534fa29ec770f5e9a174e2ad9b588054"} Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.057747 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.059206 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" event={"ID":"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7","Type":"ContainerStarted","Data":"4ea1bba22881f8f0f68733f038e2f65ce66bf4dd3d98ee29d182e2dbf34e6f51"} Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.059247 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" event={"ID":"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7","Type":"ContainerStarted","Data":"7fb4e4a486184f28ac8c3102fa74df394bd1942543bb02f5484c779eb6b4eb1e"} Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.059261 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g2z9p" event={"ID":"1ae9a1f3-c9f8-4b4a-9d4f-0f3fb900aab7","Type":"ContainerStarted","Data":"823b0b0f9c213d7d1a0761d6c502da4c3a7a6a923fb6d17677a8ddddff5837fb"} Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.060858 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" event={"ID":"fb0a321a-69c5-47bf-89f3-539a6099a650","Type":"ContainerStarted","Data":"58359aa3f4436e973624b664d07a446115f5f89707e96d1b03c930a92b5fa996"} Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 
18:06:37.060884 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" event={"ID":"fb0a321a-69c5-47bf-89f3-539a6099a650","Type":"ContainerStarted","Data":"8f83879667aaf25c94f94000e3e18e28960bd0fb4ad3d830422b39b54ecff3dc"} Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.061093 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.063426 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.079169 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" podStartSLOduration=20.079148166 podStartE2EDuration="20.079148166s" podCreationTimestamp="2026-03-18 18:06:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:37.076220878 +0000 UTC m=+253.595693967" watchObservedRunningTime="2026-03-18 18:06:37.079148166 +0000 UTC m=+253.598621245" Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.108186 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.139442 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-g2z9p" podStartSLOduration=202.139422554 podStartE2EDuration="3m22.139422554s" podCreationTimestamp="2026-03-18 18:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:37.136134965 +0000 UTC m=+253.655608044" 
watchObservedRunningTime="2026-03-18 18:06:37.139422554 +0000 UTC m=+253.658895633" Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.209748 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.247078 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" podStartSLOduration=20.247054832 podStartE2EDuration="20.247054832s" podCreationTimestamp="2026-03-18 18:06:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:37.160890247 +0000 UTC m=+253.680363326" watchObservedRunningTime="2026-03-18 18:06:37.247054832 +0000 UTC m=+253.766527911" Mar 18 18:06:37 crc kubenswrapper[5008]: I0318 18:06:37.945507 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59c9df7dd-6bgc7"] Mar 18 18:06:38 crc kubenswrapper[5008]: I0318 18:06:38.017947 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2"] Mar 18 18:06:38 crc kubenswrapper[5008]: I0318 18:06:38.068700 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"33d0dd4b-ff46-4825-b509-b4dd89e78918","Type":"ContainerStarted","Data":"c44709dede195a9508f08f3782339da233342850a9bc01de3fcef26d5ef87a81"} Mar 18 18:06:38 crc kubenswrapper[5008]: I0318 18:06:38.068740 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"33d0dd4b-ff46-4825-b509-b4dd89e78918","Type":"ContainerStarted","Data":"5f3ade4c403e71206fe01058173c2462ab462b415095ad6b3a9547cd835df2e5"} Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 
18:06:39.076102 5008 generic.go:334] "Generic (PLEG): container finished" podID="33d0dd4b-ff46-4825-b509-b4dd89e78918" containerID="c44709dede195a9508f08f3782339da233342850a9bc01de3fcef26d5ef87a81" exitCode=0 Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.076224 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"33d0dd4b-ff46-4825-b509-b4dd89e78918","Type":"ContainerDied","Data":"c44709dede195a9508f08f3782339da233342850a9bc01de3fcef26d5ef87a81"} Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.076318 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" podUID="62c2806d-3070-48d3-9cf8-7efc27c73c53" containerName="controller-manager" containerID="cri-o://8be6759b34c0c2e27b99facfd5bcd9de534fa29ec770f5e9a174e2ad9b588054" gracePeriod=30 Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.076528 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" podUID="fb0a321a-69c5-47bf-89f3-539a6099a650" containerName="route-controller-manager" containerID="cri-o://58359aa3f4436e973624b664d07a446115f5f89707e96d1b03c930a92b5fa996" gracePeriod=30 Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.321223 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.379382 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-z9ssp"] Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.444168 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33d0dd4b-ff46-4825-b509-b4dd89e78918-kube-api-access\") pod \"33d0dd4b-ff46-4825-b509-b4dd89e78918\" (UID: \"33d0dd4b-ff46-4825-b509-b4dd89e78918\") " Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.444297 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33d0dd4b-ff46-4825-b509-b4dd89e78918-kubelet-dir\") pod \"33d0dd4b-ff46-4825-b509-b4dd89e78918\" (UID: \"33d0dd4b-ff46-4825-b509-b4dd89e78918\") " Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.444664 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33d0dd4b-ff46-4825-b509-b4dd89e78918-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "33d0dd4b-ff46-4825-b509-b4dd89e78918" (UID: "33d0dd4b-ff46-4825-b509-b4dd89e78918"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.454834 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33d0dd4b-ff46-4825-b509-b4dd89e78918-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "33d0dd4b-ff46-4825-b509-b4dd89e78918" (UID: "33d0dd4b-ff46-4825-b509-b4dd89e78918"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.546212 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33d0dd4b-ff46-4825-b509-b4dd89e78918-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.546246 5008 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33d0dd4b-ff46-4825-b509-b4dd89e78918-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.563127 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.579449 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.646917 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm76d\" (UniqueName: \"kubernetes.io/projected/62c2806d-3070-48d3-9cf8-7efc27c73c53-kube-api-access-tm76d\") pod \"62c2806d-3070-48d3-9cf8-7efc27c73c53\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.647204 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pnjp\" (UniqueName: \"kubernetes.io/projected/fb0a321a-69c5-47bf-89f3-539a6099a650-kube-api-access-6pnjp\") pod \"fb0a321a-69c5-47bf-89f3-539a6099a650\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.647230 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fb0a321a-69c5-47bf-89f3-539a6099a650-config\") pod \"fb0a321a-69c5-47bf-89f3-539a6099a650\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.647250 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb0a321a-69c5-47bf-89f3-539a6099a650-client-ca\") pod \"fb0a321a-69c5-47bf-89f3-539a6099a650\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.647264 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62c2806d-3070-48d3-9cf8-7efc27c73c53-serving-cert\") pod \"62c2806d-3070-48d3-9cf8-7efc27c73c53\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.647289 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-proxy-ca-bundles\") pod \"62c2806d-3070-48d3-9cf8-7efc27c73c53\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.647575 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb0a321a-69c5-47bf-89f3-539a6099a650-serving-cert\") pod \"fb0a321a-69c5-47bf-89f3-539a6099a650\" (UID: \"fb0a321a-69c5-47bf-89f3-539a6099a650\") " Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.647653 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-config\") pod \"62c2806d-3070-48d3-9cf8-7efc27c73c53\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.647674 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-client-ca\") pod \"62c2806d-3070-48d3-9cf8-7efc27c73c53\" (UID: \"62c2806d-3070-48d3-9cf8-7efc27c73c53\") " Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.647907 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "62c2806d-3070-48d3-9cf8-7efc27c73c53" (UID: "62c2806d-3070-48d3-9cf8-7efc27c73c53"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.648062 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb0a321a-69c5-47bf-89f3-539a6099a650-client-ca" (OuterVolumeSpecName: "client-ca") pod "fb0a321a-69c5-47bf-89f3-539a6099a650" (UID: "fb0a321a-69c5-47bf-89f3-539a6099a650"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.648398 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-config" (OuterVolumeSpecName: "config") pod "62c2806d-3070-48d3-9cf8-7efc27c73c53" (UID: "62c2806d-3070-48d3-9cf8-7efc27c73c53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.648410 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-client-ca" (OuterVolumeSpecName: "client-ca") pod "62c2806d-3070-48d3-9cf8-7efc27c73c53" (UID: "62c2806d-3070-48d3-9cf8-7efc27c73c53"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.648474 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb0a321a-69c5-47bf-89f3-539a6099a650-config" (OuterVolumeSpecName: "config") pod "fb0a321a-69c5-47bf-89f3-539a6099a650" (UID: "fb0a321a-69c5-47bf-89f3-539a6099a650"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.650097 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c2806d-3070-48d3-9cf8-7efc27c73c53-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "62c2806d-3070-48d3-9cf8-7efc27c73c53" (UID: "62c2806d-3070-48d3-9cf8-7efc27c73c53"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.650372 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c2806d-3070-48d3-9cf8-7efc27c73c53-kube-api-access-tm76d" (OuterVolumeSpecName: "kube-api-access-tm76d") pod "62c2806d-3070-48d3-9cf8-7efc27c73c53" (UID: "62c2806d-3070-48d3-9cf8-7efc27c73c53"). InnerVolumeSpecName "kube-api-access-tm76d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.650764 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0a321a-69c5-47bf-89f3-539a6099a650-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fb0a321a-69c5-47bf-89f3-539a6099a650" (UID: "fb0a321a-69c5-47bf-89f3-539a6099a650"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.651704 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb0a321a-69c5-47bf-89f3-539a6099a650-kube-api-access-6pnjp" (OuterVolumeSpecName: "kube-api-access-6pnjp") pod "fb0a321a-69c5-47bf-89f3-539a6099a650" (UID: "fb0a321a-69c5-47bf-89f3-539a6099a650"). InnerVolumeSpecName "kube-api-access-6pnjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.748889 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.748926 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.748936 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm76d\" (UniqueName: \"kubernetes.io/projected/62c2806d-3070-48d3-9cf8-7efc27c73c53-kube-api-access-tm76d\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.748949 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pnjp\" (UniqueName: \"kubernetes.io/projected/fb0a321a-69c5-47bf-89f3-539a6099a650-kube-api-access-6pnjp\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.748958 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb0a321a-69c5-47bf-89f3-539a6099a650-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.748966 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fb0a321a-69c5-47bf-89f3-539a6099a650-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.748975 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62c2806d-3070-48d3-9cf8-7efc27c73c53-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.748985 5008 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62c2806d-3070-48d3-9cf8-7efc27c73c53-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:39 crc kubenswrapper[5008]: I0318 18:06:39.748995 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb0a321a-69c5-47bf-89f3-539a6099a650-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.085763 5008 generic.go:334] "Generic (PLEG): container finished" podID="fb0a321a-69c5-47bf-89f3-539a6099a650" containerID="58359aa3f4436e973624b664d07a446115f5f89707e96d1b03c930a92b5fa996" exitCode=0 Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.086536 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.089753 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" event={"ID":"fb0a321a-69c5-47bf-89f3-539a6099a650","Type":"ContainerDied","Data":"58359aa3f4436e973624b664d07a446115f5f89707e96d1b03c930a92b5fa996"} Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.089819 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2" event={"ID":"fb0a321a-69c5-47bf-89f3-539a6099a650","Type":"ContainerDied","Data":"8f83879667aaf25c94f94000e3e18e28960bd0fb4ad3d830422b39b54ecff3dc"} Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.089843 5008 scope.go:117] "RemoveContainer" containerID="58359aa3f4436e973624b664d07a446115f5f89707e96d1b03c930a92b5fa996" Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.092216 5008 generic.go:334] "Generic (PLEG): container finished" podID="62c2806d-3070-48d3-9cf8-7efc27c73c53" containerID="8be6759b34c0c2e27b99facfd5bcd9de534fa29ec770f5e9a174e2ad9b588054" exitCode=0 Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.092287 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" event={"ID":"62c2806d-3070-48d3-9cf8-7efc27c73c53","Type":"ContainerDied","Data":"8be6759b34c0c2e27b99facfd5bcd9de534fa29ec770f5e9a174e2ad9b588054"} Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.092315 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" event={"ID":"62c2806d-3070-48d3-9cf8-7efc27c73c53","Type":"ContainerDied","Data":"f8a122136f951a1c64a72af664ee6a599a5610a1c43f9267c598275464ad3ee1"} Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.092550 5008 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59c9df7dd-6bgc7" Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.097296 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"33d0dd4b-ff46-4825-b509-b4dd89e78918","Type":"ContainerDied","Data":"5f3ade4c403e71206fe01058173c2462ab462b415095ad6b3a9547cd835df2e5"} Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.097327 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f3ade4c403e71206fe01058173c2462ab462b415095ad6b3a9547cd835df2e5" Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.097737 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.110850 5008 scope.go:117] "RemoveContainer" containerID="58359aa3f4436e973624b664d07a446115f5f89707e96d1b03c930a92b5fa996" Mar 18 18:06:40 crc kubenswrapper[5008]: E0318 18:06:40.111915 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58359aa3f4436e973624b664d07a446115f5f89707e96d1b03c930a92b5fa996\": container with ID starting with 58359aa3f4436e973624b664d07a446115f5f89707e96d1b03c930a92b5fa996 not found: ID does not exist" containerID="58359aa3f4436e973624b664d07a446115f5f89707e96d1b03c930a92b5fa996" Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.111981 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58359aa3f4436e973624b664d07a446115f5f89707e96d1b03c930a92b5fa996"} err="failed to get container status \"58359aa3f4436e973624b664d07a446115f5f89707e96d1b03c930a92b5fa996\": rpc error: code = NotFound desc = could not find container \"58359aa3f4436e973624b664d07a446115f5f89707e96d1b03c930a92b5fa996\": container with ID 
starting with 58359aa3f4436e973624b664d07a446115f5f89707e96d1b03c930a92b5fa996 not found: ID does not exist" Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.112011 5008 scope.go:117] "RemoveContainer" containerID="8be6759b34c0c2e27b99facfd5bcd9de534fa29ec770f5e9a174e2ad9b588054" Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.125303 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2"] Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.130143 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68854d77bf-62mq2"] Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.133504 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59c9df7dd-6bgc7"] Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.133999 5008 scope.go:117] "RemoveContainer" containerID="8be6759b34c0c2e27b99facfd5bcd9de534fa29ec770f5e9a174e2ad9b588054" Mar 18 18:06:40 crc kubenswrapper[5008]: E0318 18:06:40.134649 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be6759b34c0c2e27b99facfd5bcd9de534fa29ec770f5e9a174e2ad9b588054\": container with ID starting with 8be6759b34c0c2e27b99facfd5bcd9de534fa29ec770f5e9a174e2ad9b588054 not found: ID does not exist" containerID="8be6759b34c0c2e27b99facfd5bcd9de534fa29ec770f5e9a174e2ad9b588054" Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.134791 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be6759b34c0c2e27b99facfd5bcd9de534fa29ec770f5e9a174e2ad9b588054"} err="failed to get container status \"8be6759b34c0c2e27b99facfd5bcd9de534fa29ec770f5e9a174e2ad9b588054\": rpc error: code = NotFound desc = could not find container \"8be6759b34c0c2e27b99facfd5bcd9de534fa29ec770f5e9a174e2ad9b588054\": 
container with ID starting with 8be6759b34c0c2e27b99facfd5bcd9de534fa29ec770f5e9a174e2ad9b588054 not found: ID does not exist" Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.136326 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-59c9df7dd-6bgc7"] Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.222086 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c2806d-3070-48d3-9cf8-7efc27c73c53" path="/var/lib/kubelet/pods/62c2806d-3070-48d3-9cf8-7efc27c73c53/volumes" Mar 18 18:06:40 crc kubenswrapper[5008]: I0318 18:06:40.223013 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb0a321a-69c5-47bf-89f3-539a6099a650" path="/var/lib/kubelet/pods/fb0a321a-69c5-47bf-89f3-539a6099a650/volumes" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.060307 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 18:06:42 crc kubenswrapper[5008]: E0318 18:06:42.060886 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d0dd4b-ff46-4825-b509-b4dd89e78918" containerName="pruner" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.060902 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d0dd4b-ff46-4825-b509-b4dd89e78918" containerName="pruner" Mar 18 18:06:42 crc kubenswrapper[5008]: E0318 18:06:42.060923 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c2806d-3070-48d3-9cf8-7efc27c73c53" containerName="controller-manager" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.060935 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c2806d-3070-48d3-9cf8-7efc27c73c53" containerName="controller-manager" Mar 18 18:06:42 crc kubenswrapper[5008]: E0318 18:06:42.060947 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb0a321a-69c5-47bf-89f3-539a6099a650" containerName="route-controller-manager" Mar 18 18:06:42 crc 
kubenswrapper[5008]: I0318 18:06:42.060955 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0a321a-69c5-47bf-89f3-539a6099a650" containerName="route-controller-manager" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.061103 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d0dd4b-ff46-4825-b509-b4dd89e78918" containerName="pruner" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.061118 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb0a321a-69c5-47bf-89f3-539a6099a650" containerName="route-controller-manager" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.061130 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c2806d-3070-48d3-9cf8-7efc27c73c53" containerName="controller-manager" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.061537 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.064409 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.066442 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.073381 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.126648 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564286-cfxqp" event={"ID":"5424805e-15fc-4424-8942-93f7095e148b","Type":"ContainerStarted","Data":"2654336f098a4c27f83fdd77a681cf3086db6e716a1aa17b63985649718fd08c"} Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.215010 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/78c1c7b1-ecc0-4966-9be8-536fcd15335e-var-lock\") pod \"installer-9-crc\" (UID: \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.215071 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78c1c7b1-ecc0-4966-9be8-536fcd15335e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.215197 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78c1c7b1-ecc0-4966-9be8-536fcd15335e-kube-api-access\") pod \"installer-9-crc\" (UID: \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.316588 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78c1c7b1-ecc0-4966-9be8-536fcd15335e-var-lock\") pod \"installer-9-crc\" (UID: \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.316978 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78c1c7b1-ecc0-4966-9be8-536fcd15335e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.317096 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/78c1c7b1-ecc0-4966-9be8-536fcd15335e-kube-api-access\") pod \"installer-9-crc\" (UID: \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.317471 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78c1c7b1-ecc0-4966-9be8-536fcd15335e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.316717 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78c1c7b1-ecc0-4966-9be8-536fcd15335e-var-lock\") pod \"installer-9-crc\" (UID: \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.330257 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d545d5c54-8g2kl"] Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.330993 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.332969 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.333393 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.333603 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.333481 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.334086 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.334203 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.336918 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4"] Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.339006 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.339606 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.341861 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78c1c7b1-ecc0-4966-9be8-536fcd15335e-kube-api-access\") pod \"installer-9-crc\" (UID: \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.342257 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.342438 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.344451 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.344456 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.344594 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.349000 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.356653 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4"] Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.392569 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d545d5c54-8g2kl"] Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.418912 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.420367 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-config\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.420497 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-client-ca\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.420686 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg644\" (UniqueName: \"kubernetes.io/projected/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-kube-api-access-sg644\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.420807 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-serving-cert\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.420959 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-proxy-ca-bundles\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.434283 5008 csr.go:261] certificate signing request csr-m7klk is approved, waiting to be issued Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.443835 5008 csr.go:257] certificate signing request csr-m7klk is issued Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.522264 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqcf5\" (UniqueName: \"kubernetes.io/projected/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-kube-api-access-nqcf5\") pod \"route-controller-manager-85b9df6bdf-jxwl4\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.522348 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-proxy-ca-bundles\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.522376 5008 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-config\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.522395 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-client-ca\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.522411 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-client-ca\") pod \"route-controller-manager-85b9df6bdf-jxwl4\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.522461 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-serving-cert\") pod \"route-controller-manager-85b9df6bdf-jxwl4\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.522482 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg644\" (UniqueName: \"kubernetes.io/projected/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-kube-api-access-sg644\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " 
pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.522504 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-config\") pod \"route-controller-manager-85b9df6bdf-jxwl4\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.522520 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-serving-cert\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.523578 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-client-ca\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.523787 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-proxy-ca-bundles\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.524578 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-config\") pod 
\"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.538219 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-serving-cert\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.540402 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg644\" (UniqueName: \"kubernetes.io/projected/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-kube-api-access-sg644\") pod \"controller-manager-5d545d5c54-8g2kl\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.623630 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-client-ca\") pod \"route-controller-manager-85b9df6bdf-jxwl4\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.623920 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-serving-cert\") pod \"route-controller-manager-85b9df6bdf-jxwl4\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.624025 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-config\") pod \"route-controller-manager-85b9df6bdf-jxwl4\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.624118 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqcf5\" (UniqueName: \"kubernetes.io/projected/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-kube-api-access-nqcf5\") pod \"route-controller-manager-85b9df6bdf-jxwl4\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.624367 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-client-ca\") pod \"route-controller-manager-85b9df6bdf-jxwl4\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.625097 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-config\") pod \"route-controller-manager-85b9df6bdf-jxwl4\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.626534 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-serving-cert\") pod \"route-controller-manager-85b9df6bdf-jxwl4\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " 
pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.642798 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqcf5\" (UniqueName: \"kubernetes.io/projected/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-kube-api-access-nqcf5\") pod \"route-controller-manager-85b9df6bdf-jxwl4\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.661982 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:42 crc kubenswrapper[5008]: I0318 18:06:42.670696 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:43 crc kubenswrapper[5008]: I0318 18:06:43.155341 5008 generic.go:334] "Generic (PLEG): container finished" podID="5424805e-15fc-4424-8942-93f7095e148b" containerID="2654336f098a4c27f83fdd77a681cf3086db6e716a1aa17b63985649718fd08c" exitCode=0 Mar 18 18:06:43 crc kubenswrapper[5008]: I0318 18:06:43.155406 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564286-cfxqp" event={"ID":"5424805e-15fc-4424-8942-93f7095e148b","Type":"ContainerDied","Data":"2654336f098a4c27f83fdd77a681cf3086db6e716a1aa17b63985649718fd08c"} Mar 18 18:06:43 crc kubenswrapper[5008]: I0318 18:06:43.350247 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 18:06:43 crc kubenswrapper[5008]: W0318 18:06:43.367383 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod78c1c7b1_ecc0_4966_9be8_536fcd15335e.slice/crio-3ff185ab969673eaa33bc63ea1cb609a3148a80e0b3759b680e41f82d1fa19f3 WatchSource:0}: 
Error finding container 3ff185ab969673eaa33bc63ea1cb609a3148a80e0b3759b680e41f82d1fa19f3: Status 404 returned error can't find the container with id 3ff185ab969673eaa33bc63ea1cb609a3148a80e0b3759b680e41f82d1fa19f3 Mar 18 18:06:43 crc kubenswrapper[5008]: I0318 18:06:43.446167 5008 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-10 18:32:39.939624868 +0000 UTC Mar 18 18:06:43 crc kubenswrapper[5008]: I0318 18:06:43.446501 5008 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7152h25m56.493128642s for next certificate rotation Mar 18 18:06:43 crc kubenswrapper[5008]: I0318 18:06:43.623756 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d545d5c54-8g2kl"] Mar 18 18:06:43 crc kubenswrapper[5008]: I0318 18:06:43.626522 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4"] Mar 18 18:06:43 crc kubenswrapper[5008]: W0318 18:06:43.633503 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d879744_4d48_4e8d_b5ab_a3e886cc9ae5.slice/crio-13708c303caccff3b406a4ce4a4b5e030da08c8d22f3528a3e41a641673aaffe WatchSource:0}: Error finding container 13708c303caccff3b406a4ce4a4b5e030da08c8d22f3528a3e41a641673aaffe: Status 404 returned error can't find the container with id 13708c303caccff3b406a4ce4a4b5e030da08c8d22f3528a3e41a641673aaffe Mar 18 18:06:43 crc kubenswrapper[5008]: W0318 18:06:43.637980 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7ebf070_7e6d_4215_afc9_6f7c7df4a442.slice/crio-333fcadb0f4a3d32811243a75f4a9cbd20747038abd6dbeed972327e5edcd4be WatchSource:0}: Error finding container 333fcadb0f4a3d32811243a75f4a9cbd20747038abd6dbeed972327e5edcd4be: Status 
404 returned error can't find the container with id 333fcadb0f4a3d32811243a75f4a9cbd20747038abd6dbeed972327e5edcd4be Mar 18 18:06:44 crc kubenswrapper[5008]: I0318 18:06:44.163891 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" event={"ID":"a7ebf070-7e6d-4215-afc9-6f7c7df4a442","Type":"ContainerStarted","Data":"333fcadb0f4a3d32811243a75f4a9cbd20747038abd6dbeed972327e5edcd4be"} Mar 18 18:06:44 crc kubenswrapper[5008]: I0318 18:06:44.165591 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"78c1c7b1-ecc0-4966-9be8-536fcd15335e","Type":"ContainerStarted","Data":"4445bddcdf5ab64c32dc31c58508ec659b76b545703ad1b1058e8322c481ebc5"} Mar 18 18:06:44 crc kubenswrapper[5008]: I0318 18:06:44.165647 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"78c1c7b1-ecc0-4966-9be8-536fcd15335e","Type":"ContainerStarted","Data":"3ff185ab969673eaa33bc63ea1cb609a3148a80e0b3759b680e41f82d1fa19f3"} Mar 18 18:06:44 crc kubenswrapper[5008]: I0318 18:06:44.167585 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" event={"ID":"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5","Type":"ContainerStarted","Data":"835f9210493eb9d6f3f3da8828233eaef4fa37e69eb615f75b66be6d3fe5b049"} Mar 18 18:06:44 crc kubenswrapper[5008]: I0318 18:06:44.167638 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" event={"ID":"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5","Type":"ContainerStarted","Data":"13708c303caccff3b406a4ce4a4b5e030da08c8d22f3528a3e41a641673aaffe"} Mar 18 18:06:44 crc kubenswrapper[5008]: I0318 18:06:44.169184 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564284-6tsrt" 
event={"ID":"51d04574-0631-403a-8bf5-4127787463d7","Type":"ContainerStarted","Data":"be96d29eb9bb52d2253258f92c3072bdf91e838b5357141b7231eb29b432cf6e"} Mar 18 18:06:44 crc kubenswrapper[5008]: I0318 18:06:44.556599 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564286-cfxqp" Mar 18 18:06:44 crc kubenswrapper[5008]: I0318 18:06:44.656407 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cm9g\" (UniqueName: \"kubernetes.io/projected/5424805e-15fc-4424-8942-93f7095e148b-kube-api-access-2cm9g\") pod \"5424805e-15fc-4424-8942-93f7095e148b\" (UID: \"5424805e-15fc-4424-8942-93f7095e148b\") " Mar 18 18:06:44 crc kubenswrapper[5008]: I0318 18:06:44.661292 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5424805e-15fc-4424-8942-93f7095e148b-kube-api-access-2cm9g" (OuterVolumeSpecName: "kube-api-access-2cm9g") pod "5424805e-15fc-4424-8942-93f7095e148b" (UID: "5424805e-15fc-4424-8942-93f7095e148b"). InnerVolumeSpecName "kube-api-access-2cm9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:44 crc kubenswrapper[5008]: I0318 18:06:44.757961 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cm9g\" (UniqueName: \"kubernetes.io/projected/5424805e-15fc-4424-8942-93f7095e148b-kube-api-access-2cm9g\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:45 crc kubenswrapper[5008]: I0318 18:06:45.176803 5008 generic.go:334] "Generic (PLEG): container finished" podID="51d04574-0631-403a-8bf5-4127787463d7" containerID="be96d29eb9bb52d2253258f92c3072bdf91e838b5357141b7231eb29b432cf6e" exitCode=0 Mar 18 18:06:45 crc kubenswrapper[5008]: I0318 18:06:45.176868 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564284-6tsrt" event={"ID":"51d04574-0631-403a-8bf5-4127787463d7","Type":"ContainerDied","Data":"be96d29eb9bb52d2253258f92c3072bdf91e838b5357141b7231eb29b432cf6e"} Mar 18 18:06:45 crc kubenswrapper[5008]: I0318 18:06:45.179491 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564286-cfxqp" event={"ID":"5424805e-15fc-4424-8942-93f7095e148b","Type":"ContainerDied","Data":"b07a05049815bff95951405e533e0ac9c6108bf2240fca351740f0af62e58dd6"} Mar 18 18:06:45 crc kubenswrapper[5008]: I0318 18:06:45.179594 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b07a05049815bff95951405e533e0ac9c6108bf2240fca351740f0af62e58dd6" Mar 18 18:06:45 crc kubenswrapper[5008]: I0318 18:06:45.179702 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564286-cfxqp" Mar 18 18:06:45 crc kubenswrapper[5008]: I0318 18:06:45.181586 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" event={"ID":"a7ebf070-7e6d-4215-afc9-6f7c7df4a442","Type":"ContainerStarted","Data":"40ebc4abc0cdb56219f4a9b24c8b35056bb33b7461ba40ce098252a0e6f14681"} Mar 18 18:06:45 crc kubenswrapper[5008]: I0318 18:06:45.181866 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:45 crc kubenswrapper[5008]: I0318 18:06:45.188106 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:45 crc kubenswrapper[5008]: I0318 18:06:45.224746 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" podStartSLOduration=7.2247001730000004 podStartE2EDuration="7.224700173s" podCreationTimestamp="2026-03-18 18:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:45.220986976 +0000 UTC m=+261.740460075" watchObservedRunningTime="2026-03-18 18:06:45.224700173 +0000 UTC m=+261.744173252" Mar 18 18:06:45 crc kubenswrapper[5008]: I0318 18:06:45.279724 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" podStartSLOduration=8.279700247 podStartE2EDuration="8.279700247s" podCreationTimestamp="2026-03-18 18:06:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:45.275705172 +0000 UTC m=+261.795178261" 
watchObservedRunningTime="2026-03-18 18:06:45.279700247 +0000 UTC m=+261.799173326" Mar 18 18:06:45 crc kubenswrapper[5008]: I0318 18:06:45.280349 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.280343394 podStartE2EDuration="3.280343394s" podCreationTimestamp="2026-03-18 18:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:06:45.238777563 +0000 UTC m=+261.758250642" watchObservedRunningTime="2026-03-18 18:06:45.280343394 +0000 UTC m=+261.799816473" Mar 18 18:06:46 crc kubenswrapper[5008]: I0318 18:06:46.189846 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:46 crc kubenswrapper[5008]: I0318 18:06:46.197211 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:46 crc kubenswrapper[5008]: I0318 18:06:46.539114 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564284-6tsrt" Mar 18 18:06:46 crc kubenswrapper[5008]: I0318 18:06:46.690001 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkgqk\" (UniqueName: \"kubernetes.io/projected/51d04574-0631-403a-8bf5-4127787463d7-kube-api-access-pkgqk\") pod \"51d04574-0631-403a-8bf5-4127787463d7\" (UID: \"51d04574-0631-403a-8bf5-4127787463d7\") " Mar 18 18:06:46 crc kubenswrapper[5008]: I0318 18:06:46.698946 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d04574-0631-403a-8bf5-4127787463d7-kube-api-access-pkgqk" (OuterVolumeSpecName: "kube-api-access-pkgqk") pod "51d04574-0631-403a-8bf5-4127787463d7" (UID: "51d04574-0631-403a-8bf5-4127787463d7"). InnerVolumeSpecName "kube-api-access-pkgqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:46 crc kubenswrapper[5008]: I0318 18:06:46.791828 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkgqk\" (UniqueName: \"kubernetes.io/projected/51d04574-0631-403a-8bf5-4127787463d7-kube-api-access-pkgqk\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:47 crc kubenswrapper[5008]: I0318 18:06:47.196317 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564284-6tsrt" event={"ID":"51d04574-0631-403a-8bf5-4127787463d7","Type":"ContainerDied","Data":"005063dad0bdd4186802503e453f79d06cce557cc0ca364ed8c98b6d1f00b40c"} Mar 18 18:06:47 crc kubenswrapper[5008]: I0318 18:06:47.196374 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564284-6tsrt" Mar 18 18:06:47 crc kubenswrapper[5008]: I0318 18:06:47.196377 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="005063dad0bdd4186802503e453f79d06cce557cc0ca364ed8c98b6d1f00b40c" Mar 18 18:06:48 crc kubenswrapper[5008]: I0318 18:06:48.214871 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" containerID="190c891531b85bebe1625006bf2b3b62ec24ba403ee169dfc6cc161766ddc00c" exitCode=0 Mar 18 18:06:48 crc kubenswrapper[5008]: I0318 18:06:48.221392 5008 generic.go:334] "Generic (PLEG): container finished" podID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" containerID="501fd843db8c064a0a0ed6640f8fbe108a19100b544d4f2fdae8eb931e22b264" exitCode=0 Mar 18 18:06:48 crc kubenswrapper[5008]: I0318 18:06:48.232506 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2kl6" event={"ID":"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f","Type":"ContainerDied","Data":"190c891531b85bebe1625006bf2b3b62ec24ba403ee169dfc6cc161766ddc00c"} Mar 18 18:06:48 crc kubenswrapper[5008]: I0318 18:06:48.232625 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvkpk" event={"ID":"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6","Type":"ContainerDied","Data":"501fd843db8c064a0a0ed6640f8fbe108a19100b544d4f2fdae8eb931e22b264"} Mar 18 18:06:49 crc kubenswrapper[5008]: I0318 18:06:49.234995 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2kl6" event={"ID":"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f","Type":"ContainerStarted","Data":"9cc2da88e46912441366c2c7211ed8908ccfdf5ad1cada51732f328c8eef0bf9"} Mar 18 18:06:49 crc kubenswrapper[5008]: I0318 18:06:49.237002 5008 generic.go:334] "Generic (PLEG): container finished" podID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" 
containerID="c72f2aa1b5d6f142979dd61a8c7dfa3b1138f23927b9bc8005cd93d927b45a4f" exitCode=0 Mar 18 18:06:49 crc kubenswrapper[5008]: I0318 18:06:49.237095 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nhgk" event={"ID":"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3","Type":"ContainerDied","Data":"c72f2aa1b5d6f142979dd61a8c7dfa3b1138f23927b9bc8005cd93d927b45a4f"} Mar 18 18:06:49 crc kubenswrapper[5008]: I0318 18:06:49.247732 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvkpk" event={"ID":"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6","Type":"ContainerStarted","Data":"faf26a0a94a592e2827953bea96c2a03603c0000202d1c837eb7da6c595d65b3"} Mar 18 18:06:49 crc kubenswrapper[5008]: I0318 18:06:49.263515 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d2kl6" podStartSLOduration=3.636869813 podStartE2EDuration="54.26349202s" podCreationTimestamp="2026-03-18 18:05:55 +0000 UTC" firstStartedPulling="2026-03-18 18:05:58.035832799 +0000 UTC m=+214.555305878" lastFinishedPulling="2026-03-18 18:06:48.662455006 +0000 UTC m=+265.181928085" observedRunningTime="2026-03-18 18:06:49.260904402 +0000 UTC m=+265.780377521" watchObservedRunningTime="2026-03-18 18:06:49.26349202 +0000 UTC m=+265.782965109" Mar 18 18:06:49 crc kubenswrapper[5008]: I0318 18:06:49.315124 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dvkpk" podStartSLOduration=3.991373893 podStartE2EDuration="51.315094333s" podCreationTimestamp="2026-03-18 18:05:58 +0000 UTC" firstStartedPulling="2026-03-18 18:06:01.412100301 +0000 UTC m=+217.931573380" lastFinishedPulling="2026-03-18 18:06:48.735820711 +0000 UTC m=+265.255293820" observedRunningTime="2026-03-18 18:06:49.307920305 +0000 UTC m=+265.827393414" watchObservedRunningTime="2026-03-18 18:06:49.315094333 +0000 UTC m=+265.834567452" Mar 18 
18:06:50 crc kubenswrapper[5008]: I0318 18:06:50.257581 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nhgk" event={"ID":"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3","Type":"ContainerStarted","Data":"4d5b51677a2fd68f58f8d49ac4481dd3d7a09866c4390d1f2419ba024e260f8a"} Mar 18 18:06:50 crc kubenswrapper[5008]: I0318 18:06:50.259704 5008 generic.go:334] "Generic (PLEG): container finished" podID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" containerID="194e3d640c40bab5e588c8cf75e4e4e3a5e9299c84f6d0cf8aa3174472792acb" exitCode=0 Mar 18 18:06:50 crc kubenswrapper[5008]: I0318 18:06:50.259784 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krh4d" event={"ID":"e03ee689-ed4f-4b64-9e4a-4d6febd71716","Type":"ContainerDied","Data":"194e3d640c40bab5e588c8cf75e4e4e3a5e9299c84f6d0cf8aa3174472792acb"} Mar 18 18:06:50 crc kubenswrapper[5008]: I0318 18:06:50.263226 5008 generic.go:334] "Generic (PLEG): container finished" podID="420d2432-4da0-4be3-8489-56c06a682e03" containerID="2ae718b181bfb756e9349fe749313e583b1d1f9cbb4c77638751811927089977" exitCode=0 Mar 18 18:06:50 crc kubenswrapper[5008]: I0318 18:06:50.263276 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xth4j" event={"ID":"420d2432-4da0-4be3-8489-56c06a682e03","Type":"ContainerDied","Data":"2ae718b181bfb756e9349fe749313e583b1d1f9cbb4c77638751811927089977"} Mar 18 18:06:50 crc kubenswrapper[5008]: I0318 18:06:50.268626 5008 generic.go:334] "Generic (PLEG): container finished" podID="d548a7a8-a808-45d3-91be-b0e9242383ec" containerID="0030a92821f73c5fd1b21ea5fb458ea355df7bcc8a98688d1f9376b6858cee5f" exitCode=0 Mar 18 18:06:50 crc kubenswrapper[5008]: I0318 18:06:50.268779 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fln9j" 
event={"ID":"d548a7a8-a808-45d3-91be-b0e9242383ec","Type":"ContainerDied","Data":"0030a92821f73c5fd1b21ea5fb458ea355df7bcc8a98688d1f9376b6858cee5f"} Mar 18 18:06:50 crc kubenswrapper[5008]: I0318 18:06:50.279646 5008 generic.go:334] "Generic (PLEG): container finished" podID="9209339f-be23-444c-a635-04920e6a0cf6" containerID="6adb0576c5913e8c7712e8b51d776ae31be24f4483671b3746847b27180443e0" exitCode=0 Mar 18 18:06:50 crc kubenswrapper[5008]: I0318 18:06:50.279703 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b59s5" event={"ID":"9209339f-be23-444c-a635-04920e6a0cf6","Type":"ContainerDied","Data":"6adb0576c5913e8c7712e8b51d776ae31be24f4483671b3746847b27180443e0"} Mar 18 18:06:50 crc kubenswrapper[5008]: I0318 18:06:50.293462 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5nhgk" podStartSLOduration=2.70154965 podStartE2EDuration="52.29343679s" podCreationTimestamp="2026-03-18 18:05:58 +0000 UTC" firstStartedPulling="2026-03-18 18:06:00.355390105 +0000 UTC m=+216.874863184" lastFinishedPulling="2026-03-18 18:06:49.947277235 +0000 UTC m=+266.466750324" observedRunningTime="2026-03-18 18:06:50.290136163 +0000 UTC m=+266.809609242" watchObservedRunningTime="2026-03-18 18:06:50.29343679 +0000 UTC m=+266.812909879" Mar 18 18:06:51 crc kubenswrapper[5008]: I0318 18:06:51.299319 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xth4j" event={"ID":"420d2432-4da0-4be3-8489-56c06a682e03","Type":"ContainerStarted","Data":"1cbac443129b9336bfcf92ffdb3a27b95779323162385c2b5960f203fd7ce80b"} Mar 18 18:06:51 crc kubenswrapper[5008]: I0318 18:06:51.303473 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fln9j" event={"ID":"d548a7a8-a808-45d3-91be-b0e9242383ec","Type":"ContainerStarted","Data":"f8c1537eda52ea5882193d2c7e5a77551b13a838fcfa0e680aa1d3eed9ed9656"} 
Mar 18 18:06:51 crc kubenswrapper[5008]: I0318 18:06:51.315096 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b59s5" event={"ID":"9209339f-be23-444c-a635-04920e6a0cf6","Type":"ContainerStarted","Data":"2ce79d08f9d51b90cdf84677ccf94509cc2f99e596171e28bb256d4623a0adc1"} Mar 18 18:06:51 crc kubenswrapper[5008]: I0318 18:06:51.325201 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xth4j" podStartSLOduration=3.340492846 podStartE2EDuration="57.325173288s" podCreationTimestamp="2026-03-18 18:05:54 +0000 UTC" firstStartedPulling="2026-03-18 18:05:56.938978488 +0000 UTC m=+213.458451567" lastFinishedPulling="2026-03-18 18:06:50.92365893 +0000 UTC m=+267.443132009" observedRunningTime="2026-03-18 18:06:51.322427286 +0000 UTC m=+267.841900375" watchObservedRunningTime="2026-03-18 18:06:51.325173288 +0000 UTC m=+267.844646367" Mar 18 18:06:51 crc kubenswrapper[5008]: I0318 18:06:51.346299 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b59s5" podStartSLOduration=3.667436528 podStartE2EDuration="56.346281532s" podCreationTimestamp="2026-03-18 18:05:55 +0000 UTC" firstStartedPulling="2026-03-18 18:05:58.055033295 +0000 UTC m=+214.574506374" lastFinishedPulling="2026-03-18 18:06:50.733878299 +0000 UTC m=+267.253351378" observedRunningTime="2026-03-18 18:06:51.34619855 +0000 UTC m=+267.865671629" watchObservedRunningTime="2026-03-18 18:06:51.346281532 +0000 UTC m=+267.865754611" Mar 18 18:06:51 crc kubenswrapper[5008]: I0318 18:06:51.375646 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fln9j" podStartSLOduration=3.2662878859999998 podStartE2EDuration="56.37554743s" podCreationTimestamp="2026-03-18 18:05:55 +0000 UTC" firstStartedPulling="2026-03-18 18:05:58.008895151 +0000 UTC m=+214.528368230" 
lastFinishedPulling="2026-03-18 18:06:51.118154695 +0000 UTC m=+267.637627774" observedRunningTime="2026-03-18 18:06:51.368047753 +0000 UTC m=+267.887520832" watchObservedRunningTime="2026-03-18 18:06:51.37554743 +0000 UTC m=+267.895020509" Mar 18 18:06:52 crc kubenswrapper[5008]: I0318 18:06:52.327751 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krh4d" event={"ID":"e03ee689-ed4f-4b64-9e4a-4d6febd71716","Type":"ContainerStarted","Data":"b5a2e21b5861f8a51088588866da2565d9ca3c8e2ad7d1cccc868592d5f82006"} Mar 18 18:06:52 crc kubenswrapper[5008]: I0318 18:06:52.330888 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrtz8" event={"ID":"5d041acc-48d2-4f2f-896f-94893b9ff41f","Type":"ContainerStarted","Data":"83d7470e7ce2ee42c888f13b5447487b65b48243a89b4b9f0607048c0a6ad1ac"} Mar 18 18:06:52 crc kubenswrapper[5008]: I0318 18:06:52.363662 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-krh4d" podStartSLOduration=3.3003509859999998 podStartE2EDuration="55.363630602s" podCreationTimestamp="2026-03-18 18:05:57 +0000 UTC" firstStartedPulling="2026-03-18 18:05:59.162251997 +0000 UTC m=+215.681725076" lastFinishedPulling="2026-03-18 18:06:51.225531613 +0000 UTC m=+267.745004692" observedRunningTime="2026-03-18 18:06:52.362200144 +0000 UTC m=+268.881673223" watchObservedRunningTime="2026-03-18 18:06:52.363630602 +0000 UTC m=+268.883103681" Mar 18 18:06:53 crc kubenswrapper[5008]: I0318 18:06:53.340583 5008 generic.go:334] "Generic (PLEG): container finished" podID="5d041acc-48d2-4f2f-896f-94893b9ff41f" containerID="83d7470e7ce2ee42c888f13b5447487b65b48243a89b4b9f0607048c0a6ad1ac" exitCode=0 Mar 18 18:06:53 crc kubenswrapper[5008]: I0318 18:06:53.340668 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrtz8" 
event={"ID":"5d041acc-48d2-4f2f-896f-94893b9ff41f","Type":"ContainerDied","Data":"83d7470e7ce2ee42c888f13b5447487b65b48243a89b4b9f0607048c0a6ad1ac"} Mar 18 18:06:54 crc kubenswrapper[5008]: I0318 18:06:54.460117 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:06:54 crc kubenswrapper[5008]: I0318 18:06:54.460600 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:06:54 crc kubenswrapper[5008]: I0318 18:06:54.460661 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:06:54 crc kubenswrapper[5008]: I0318 18:06:54.461292 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:06:54 crc kubenswrapper[5008]: I0318 18:06:54.461360 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7" gracePeriod=600 Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.287528 5008 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.288596 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.355613 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7" exitCode=0 Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.355655 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7"} Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.355680 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"265c93bd38176e028a3b20d735aef8eb6b45124abbc855b3703820d202fa1f53"} Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.457409 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b59s5" Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.457519 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b59s5" Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.555918 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.557700 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-b59s5" Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.616964 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fln9j" Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.617020 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fln9j" Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.660602 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fln9j" Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.985090 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d2kl6" Mar 18 18:06:55 crc kubenswrapper[5008]: I0318 18:06:55.985149 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d2kl6" Mar 18 18:06:56 crc kubenswrapper[5008]: I0318 18:06:56.030371 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d2kl6" Mar 18 18:06:56 crc kubenswrapper[5008]: I0318 18:06:56.405386 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b59s5" Mar 18 18:06:56 crc kubenswrapper[5008]: I0318 18:06:56.412160 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d2kl6" Mar 18 18:06:56 crc kubenswrapper[5008]: I0318 18:06:56.412841 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fln9j" Mar 18 18:06:57 crc kubenswrapper[5008]: I0318 18:06:57.373175 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrtz8" 
event={"ID":"5d041acc-48d2-4f2f-896f-94893b9ff41f","Type":"ContainerStarted","Data":"5805c353901b2fa8e9bbcdf9c04063f82345b79b1305fbee5410a5bfa21d3062"} Mar 18 18:06:57 crc kubenswrapper[5008]: I0318 18:06:57.406327 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vrtz8" podStartSLOduration=3.700954803 podStartE2EDuration="1m0.406292534s" podCreationTimestamp="2026-03-18 18:05:57 +0000 UTC" firstStartedPulling="2026-03-18 18:06:00.320363084 +0000 UTC m=+216.839836163" lastFinishedPulling="2026-03-18 18:06:57.025700805 +0000 UTC m=+273.545173894" observedRunningTime="2026-03-18 18:06:57.400346568 +0000 UTC m=+273.919819657" watchObservedRunningTime="2026-03-18 18:06:57.406292534 +0000 UTC m=+273.925765643" Mar 18 18:06:57 crc kubenswrapper[5008]: I0318 18:06:57.542348 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:06:57 crc kubenswrapper[5008]: I0318 18:06:57.542450 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:06:57 crc kubenswrapper[5008]: I0318 18:06:57.775096 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:06:57 crc kubenswrapper[5008]: I0318 18:06:57.775161 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:06:57 crc kubenswrapper[5008]: I0318 18:06:57.841243 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:06:57 crc kubenswrapper[5008]: I0318 18:06:57.933399 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d545d5c54-8g2kl"] Mar 18 18:06:57 crc kubenswrapper[5008]: I0318 18:06:57.933629 5008 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" podUID="9d879744-4d48-4e8d-b5ab-a3e886cc9ae5" containerName="controller-manager" containerID="cri-o://835f9210493eb9d6f3f3da8828233eaef4fa37e69eb615f75b66be6d3fe5b049" gracePeriod=30 Mar 18 18:06:57 crc kubenswrapper[5008]: I0318 18:06:57.953660 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4"] Mar 18 18:06:57 crc kubenswrapper[5008]: I0318 18:06:57.953923 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" podUID="a7ebf070-7e6d-4215-afc9-6f7c7df4a442" containerName="route-controller-manager" containerID="cri-o://40ebc4abc0cdb56219f4a9b24c8b35056bb33b7461ba40ce098252a0e6f14681" gracePeriod=30 Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.380305 5008 generic.go:334] "Generic (PLEG): container finished" podID="a7ebf070-7e6d-4215-afc9-6f7c7df4a442" containerID="40ebc4abc0cdb56219f4a9b24c8b35056bb33b7461ba40ce098252a0e6f14681" exitCode=0 Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.380493 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" event={"ID":"a7ebf070-7e6d-4215-afc9-6f7c7df4a442","Type":"ContainerDied","Data":"40ebc4abc0cdb56219f4a9b24c8b35056bb33b7461ba40ce098252a0e6f14681"} Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.381843 5008 generic.go:334] "Generic (PLEG): container finished" podID="9d879744-4d48-4e8d-b5ab-a3e886cc9ae5" containerID="835f9210493eb9d6f3f3da8828233eaef4fa37e69eb615f75b66be6d3fe5b049" exitCode=0 Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.382335 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" 
event={"ID":"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5","Type":"ContainerDied","Data":"835f9210493eb9d6f3f3da8828233eaef4fa37e69eb615f75b66be6d3fe5b049"} Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.440502 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.448178 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.448244 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.483673 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.484888 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.497573 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.581035 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqcf5\" (UniqueName: \"kubernetes.io/projected/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-kube-api-access-nqcf5\") pod \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.581110 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg644\" (UniqueName: \"kubernetes.io/projected/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-kube-api-access-sg644\") pod \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.581173 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-config\") pod \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.581211 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-config\") pod \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.581267 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-proxy-ca-bundles\") pod 
\"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.581285 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-serving-cert\") pod \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.581985 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-client-ca\") pod \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\" (UID: \"a7ebf070-7e6d-4215-afc9-6f7c7df4a442\") " Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.582027 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-client-ca\") pod \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.582045 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-serving-cert\") pod \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\" (UID: \"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5\") " Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.581949 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-config" (OuterVolumeSpecName: "config") pod "a7ebf070-7e6d-4215-afc9-6f7c7df4a442" (UID: "a7ebf070-7e6d-4215-afc9-6f7c7df4a442"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.582320 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-client-ca" (OuterVolumeSpecName: "client-ca") pod "a7ebf070-7e6d-4215-afc9-6f7c7df4a442" (UID: "a7ebf070-7e6d-4215-afc9-6f7c7df4a442"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.582064 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9d879744-4d48-4e8d-b5ab-a3e886cc9ae5" (UID: "9d879744-4d48-4e8d-b5ab-a3e886cc9ae5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.582096 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-config" (OuterVolumeSpecName: "config") pod "9d879744-4d48-4e8d-b5ab-a3e886cc9ae5" (UID: "9d879744-4d48-4e8d-b5ab-a3e886cc9ae5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.582374 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-client-ca" (OuterVolumeSpecName: "client-ca") pod "9d879744-4d48-4e8d-b5ab-a3e886cc9ae5" (UID: "9d879744-4d48-4e8d-b5ab-a3e886cc9ae5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.582602 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.582615 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.582624 5008 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.582633 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.582642 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.586270 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-kube-api-access-nqcf5" (OuterVolumeSpecName: "kube-api-access-nqcf5") pod "a7ebf070-7e6d-4215-afc9-6f7c7df4a442" (UID: "a7ebf070-7e6d-4215-afc9-6f7c7df4a442"). InnerVolumeSpecName "kube-api-access-nqcf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.586372 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-kube-api-access-sg644" (OuterVolumeSpecName: "kube-api-access-sg644") pod "9d879744-4d48-4e8d-b5ab-a3e886cc9ae5" (UID: "9d879744-4d48-4e8d-b5ab-a3e886cc9ae5"). InnerVolumeSpecName "kube-api-access-sg644". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.586826 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d879744-4d48-4e8d-b5ab-a3e886cc9ae5" (UID: "9d879744-4d48-4e8d-b5ab-a3e886cc9ae5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.588713 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a7ebf070-7e6d-4215-afc9-6f7c7df4a442" (UID: "a7ebf070-7e6d-4215-afc9-6f7c7df4a442"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.604882 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-vrtz8" podUID="5d041acc-48d2-4f2f-896f-94893b9ff41f" containerName="registry-server" probeResult="failure" output=< Mar 18 18:06:58 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s Mar 18 18:06:58 crc kubenswrapper[5008]: > Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.683483 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.683522 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.683534 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqcf5\" (UniqueName: \"kubernetes.io/projected/a7ebf070-7e6d-4215-afc9-6f7c7df4a442-kube-api-access-nqcf5\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.683545 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg644\" (UniqueName: \"kubernetes.io/projected/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5-kube-api-access-sg644\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.722714 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d2kl6"] Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.723064 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d2kl6" podUID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" containerName="registry-server" 
containerID="cri-o://9cc2da88e46912441366c2c7211ed8908ccfdf5ad1cada51732f328c8eef0bf9" gracePeriod=2 Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.830210 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.830674 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:06:58 crc kubenswrapper[5008]: I0318 18:06:58.873420 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.146010 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d2kl6" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.289763 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-utilities\") pod \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\" (UID: \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\") " Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.289877 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk5n2\" (UniqueName: \"kubernetes.io/projected/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-kube-api-access-gk5n2\") pod \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\" (UID: \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\") " Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.289944 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-catalog-content\") pod \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\" (UID: \"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f\") " Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.290478 5008 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-utilities" (OuterVolumeSpecName: "utilities") pod "f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" (UID: "f4f135d3-9f42-4d08-bfd2-71aa61f5e01f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.292501 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.295541 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-kube-api-access-gk5n2" (OuterVolumeSpecName: "kube-api-access-gk5n2") pod "f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" (UID: "f4f135d3-9f42-4d08-bfd2-71aa61f5e01f"). InnerVolumeSpecName "kube-api-access-gk5n2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.348694 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg"] Mar 18 18:06:59 crc kubenswrapper[5008]: E0318 18:06:59.349028 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5424805e-15fc-4424-8942-93f7095e148b" containerName="oc" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.349045 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="5424805e-15fc-4424-8942-93f7095e148b" containerName="oc" Mar 18 18:06:59 crc kubenswrapper[5008]: E0318 18:06:59.349061 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d879744-4d48-4e8d-b5ab-a3e886cc9ae5" containerName="controller-manager" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.349070 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d879744-4d48-4e8d-b5ab-a3e886cc9ae5" containerName="controller-manager" Mar 18 18:06:59 crc kubenswrapper[5008]: E0318 18:06:59.349083 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" containerName="registry-server" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.349091 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" containerName="registry-server" Mar 18 18:06:59 crc kubenswrapper[5008]: E0318 18:06:59.349104 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d04574-0631-403a-8bf5-4127787463d7" containerName="oc" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.349113 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d04574-0631-403a-8bf5-4127787463d7" containerName="oc" Mar 18 18:06:59 crc kubenswrapper[5008]: E0318 18:06:59.349132 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" 
containerName="extract-content" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.349140 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" containerName="extract-content" Mar 18 18:06:59 crc kubenswrapper[5008]: E0318 18:06:59.349152 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ebf070-7e6d-4215-afc9-6f7c7df4a442" containerName="route-controller-manager" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.349162 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ebf070-7e6d-4215-afc9-6f7c7df4a442" containerName="route-controller-manager" Mar 18 18:06:59 crc kubenswrapper[5008]: E0318 18:06:59.349172 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" containerName="extract-utilities" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.349180 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" containerName="extract-utilities" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.349295 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d04574-0631-403a-8bf5-4127787463d7" containerName="oc" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.349315 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d879744-4d48-4e8d-b5ab-a3e886cc9ae5" containerName="controller-manager" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.349327 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="5424805e-15fc-4424-8942-93f7095e148b" containerName="oc" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.349336 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ebf070-7e6d-4215-afc9-6f7c7df4a442" containerName="route-controller-manager" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.349346 5008 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" containerName="registry-server" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.349834 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.354416 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-755ccc5586-bsnx5"] Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.355295 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.361020 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" (UID: "f4f135d3-9f42-4d08-bfd2-71aa61f5e01f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.364580 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg"] Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.369280 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-755ccc5586-bsnx5"] Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.407159 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk5n2\" (UniqueName: \"kubernetes.io/projected/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-kube-api-access-gk5n2\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.407188 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.413415 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.413426 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d545d5c54-8g2kl" event={"ID":"9d879744-4d48-4e8d-b5ab-a3e886cc9ae5","Type":"ContainerDied","Data":"13708c303caccff3b406a4ce4a4b5e030da08c8d22f3528a3e41a641673aaffe"} Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.413506 5008 scope.go:117] "RemoveContainer" containerID="835f9210493eb9d6f3f3da8828233eaef4fa37e69eb615f75b66be6d3fe5b049" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.415247 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.415238 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4" event={"ID":"a7ebf070-7e6d-4215-afc9-6f7c7df4a442","Type":"ContainerDied","Data":"333fcadb0f4a3d32811243a75f4a9cbd20747038abd6dbeed972327e5edcd4be"} Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.418833 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" containerID="9cc2da88e46912441366c2c7211ed8908ccfdf5ad1cada51732f328c8eef0bf9" exitCode=0 Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.418935 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d2kl6" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.418987 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2kl6" event={"ID":"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f","Type":"ContainerDied","Data":"9cc2da88e46912441366c2c7211ed8908ccfdf5ad1cada51732f328c8eef0bf9"} Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.419006 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2kl6" event={"ID":"f4f135d3-9f42-4d08-bfd2-71aa61f5e01f","Type":"ContainerDied","Data":"8216fee027d5a797a247c9874c123cc6342a9d483894ed930845af810d844216"} Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.445883 5008 scope.go:117] "RemoveContainer" containerID="40ebc4abc0cdb56219f4a9b24c8b35056bb33b7461ba40ce098252a0e6f14681" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.463802 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d545d5c54-8g2kl"] Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.465591 5008 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.469470 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d545d5c54-8g2kl"] Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.477737 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4"] Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.478773 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.478869 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85b9df6bdf-jxwl4"] Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.487527 5008 scope.go:117] "RemoveContainer" containerID="9cc2da88e46912441366c2c7211ed8908ccfdf5ad1cada51732f328c8eef0bf9" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.488387 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d2kl6"] Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.491993 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d2kl6"] Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.513037 5008 scope.go:117] "RemoveContainer" containerID="190c891531b85bebe1625006bf2b3b62ec24ba403ee169dfc6cc161766ddc00c" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.519003 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de4988b0-2070-45c2-be25-e64b8fe41965-serving-cert\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " 
pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.519076 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-config\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.519114 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-proxy-ca-bundles\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.519147 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a1a035c-088e-454b-97cf-c5db6131aa2d-client-ca\") pod \"route-controller-manager-7fbb75d756-z6xlg\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.519188 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-client-ca\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.519261 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2a1a035c-088e-454b-97cf-c5db6131aa2d-config\") pod \"route-controller-manager-7fbb75d756-z6xlg\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.519295 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqm52\" (UniqueName: \"kubernetes.io/projected/2a1a035c-088e-454b-97cf-c5db6131aa2d-kube-api-access-mqm52\") pod \"route-controller-manager-7fbb75d756-z6xlg\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.519334 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a1a035c-088e-454b-97cf-c5db6131aa2d-serving-cert\") pod \"route-controller-manager-7fbb75d756-z6xlg\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.519351 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r88kd\" (UniqueName: \"kubernetes.io/projected/de4988b0-2070-45c2-be25-e64b8fe41965-kube-api-access-r88kd\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.551071 5008 scope.go:117] "RemoveContainer" containerID="87b3e9832a89e9c26a806da7aec0de7854fde850dd4858d84b8caa8bfa24c518" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.565440 5008 scope.go:117] "RemoveContainer" 
containerID="9cc2da88e46912441366c2c7211ed8908ccfdf5ad1cada51732f328c8eef0bf9" Mar 18 18:06:59 crc kubenswrapper[5008]: E0318 18:06:59.565876 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cc2da88e46912441366c2c7211ed8908ccfdf5ad1cada51732f328c8eef0bf9\": container with ID starting with 9cc2da88e46912441366c2c7211ed8908ccfdf5ad1cada51732f328c8eef0bf9 not found: ID does not exist" containerID="9cc2da88e46912441366c2c7211ed8908ccfdf5ad1cada51732f328c8eef0bf9" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.565923 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc2da88e46912441366c2c7211ed8908ccfdf5ad1cada51732f328c8eef0bf9"} err="failed to get container status \"9cc2da88e46912441366c2c7211ed8908ccfdf5ad1cada51732f328c8eef0bf9\": rpc error: code = NotFound desc = could not find container \"9cc2da88e46912441366c2c7211ed8908ccfdf5ad1cada51732f328c8eef0bf9\": container with ID starting with 9cc2da88e46912441366c2c7211ed8908ccfdf5ad1cada51732f328c8eef0bf9 not found: ID does not exist" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.565954 5008 scope.go:117] "RemoveContainer" containerID="190c891531b85bebe1625006bf2b3b62ec24ba403ee169dfc6cc161766ddc00c" Mar 18 18:06:59 crc kubenswrapper[5008]: E0318 18:06:59.566359 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"190c891531b85bebe1625006bf2b3b62ec24ba403ee169dfc6cc161766ddc00c\": container with ID starting with 190c891531b85bebe1625006bf2b3b62ec24ba403ee169dfc6cc161766ddc00c not found: ID does not exist" containerID="190c891531b85bebe1625006bf2b3b62ec24ba403ee169dfc6cc161766ddc00c" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.566390 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"190c891531b85bebe1625006bf2b3b62ec24ba403ee169dfc6cc161766ddc00c"} err="failed to get container status \"190c891531b85bebe1625006bf2b3b62ec24ba403ee169dfc6cc161766ddc00c\": rpc error: code = NotFound desc = could not find container \"190c891531b85bebe1625006bf2b3b62ec24ba403ee169dfc6cc161766ddc00c\": container with ID starting with 190c891531b85bebe1625006bf2b3b62ec24ba403ee169dfc6cc161766ddc00c not found: ID does not exist" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.566412 5008 scope.go:117] "RemoveContainer" containerID="87b3e9832a89e9c26a806da7aec0de7854fde850dd4858d84b8caa8bfa24c518" Mar 18 18:06:59 crc kubenswrapper[5008]: E0318 18:06:59.566767 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b3e9832a89e9c26a806da7aec0de7854fde850dd4858d84b8caa8bfa24c518\": container with ID starting with 87b3e9832a89e9c26a806da7aec0de7854fde850dd4858d84b8caa8bfa24c518 not found: ID does not exist" containerID="87b3e9832a89e9c26a806da7aec0de7854fde850dd4858d84b8caa8bfa24c518" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.566816 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b3e9832a89e9c26a806da7aec0de7854fde850dd4858d84b8caa8bfa24c518"} err="failed to get container status \"87b3e9832a89e9c26a806da7aec0de7854fde850dd4858d84b8caa8bfa24c518\": rpc error: code = NotFound desc = could not find container \"87b3e9832a89e9c26a806da7aec0de7854fde850dd4858d84b8caa8bfa24c518\": container with ID starting with 87b3e9832a89e9c26a806da7aec0de7854fde850dd4858d84b8caa8bfa24c518 not found: ID does not exist" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.625223 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a1a035c-088e-454b-97cf-c5db6131aa2d-serving-cert\") pod \"route-controller-manager-7fbb75d756-z6xlg\" 
(UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.625311 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r88kd\" (UniqueName: \"kubernetes.io/projected/de4988b0-2070-45c2-be25-e64b8fe41965-kube-api-access-r88kd\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.625457 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de4988b0-2070-45c2-be25-e64b8fe41965-serving-cert\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.625497 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-config\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.625536 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-proxy-ca-bundles\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.625586 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2a1a035c-088e-454b-97cf-c5db6131aa2d-client-ca\") pod \"route-controller-manager-7fbb75d756-z6xlg\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.625610 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-client-ca\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.625691 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a1a035c-088e-454b-97cf-c5db6131aa2d-config\") pod \"route-controller-manager-7fbb75d756-z6xlg\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.625733 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqm52\" (UniqueName: \"kubernetes.io/projected/2a1a035c-088e-454b-97cf-c5db6131aa2d-kube-api-access-mqm52\") pod \"route-controller-manager-7fbb75d756-z6xlg\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.626987 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a1a035c-088e-454b-97cf-c5db6131aa2d-client-ca\") pod \"route-controller-manager-7fbb75d756-z6xlg\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc 
kubenswrapper[5008]: I0318 18:06:59.627196 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-client-ca\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.627374 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-proxy-ca-bundles\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.627388 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-config\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.630082 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de4988b0-2070-45c2-be25-e64b8fe41965-serving-cert\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.630806 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a1a035c-088e-454b-97cf-c5db6131aa2d-config\") pod \"route-controller-manager-7fbb75d756-z6xlg\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " 
pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.639278 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a1a035c-088e-454b-97cf-c5db6131aa2d-serving-cert\") pod \"route-controller-manager-7fbb75d756-z6xlg\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.642870 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqm52\" (UniqueName: \"kubernetes.io/projected/2a1a035c-088e-454b-97cf-c5db6131aa2d-kube-api-access-mqm52\") pod \"route-controller-manager-7fbb75d756-z6xlg\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.643301 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r88kd\" (UniqueName: \"kubernetes.io/projected/de4988b0-2070-45c2-be25-e64b8fe41965-kube-api-access-r88kd\") pod \"controller-manager-755ccc5586-bsnx5\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.708346 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.733373 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:06:59 crc kubenswrapper[5008]: I0318 18:06:59.946731 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-755ccc5586-bsnx5"] Mar 18 18:06:59 crc kubenswrapper[5008]: W0318 18:06:59.952325 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde4988b0_2070_45c2_be25_e64b8fe41965.slice/crio-2a54b194957d3a5636b9425b3abe9a52f184fb976ade3e561d2538915547ebfd WatchSource:0}: Error finding container 2a54b194957d3a5636b9425b3abe9a52f184fb976ade3e561d2538915547ebfd: Status 404 returned error can't find the container with id 2a54b194957d3a5636b9425b3abe9a52f184fb976ade3e561d2538915547ebfd Mar 18 18:07:00 crc kubenswrapper[5008]: I0318 18:07:00.104470 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg"] Mar 18 18:07:00 crc kubenswrapper[5008]: I0318 18:07:00.205793 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d879744-4d48-4e8d-b5ab-a3e886cc9ae5" path="/var/lib/kubelet/pods/9d879744-4d48-4e8d-b5ab-a3e886cc9ae5/volumes" Mar 18 18:07:00 crc kubenswrapper[5008]: I0318 18:07:00.206752 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ebf070-7e6d-4215-afc9-6f7c7df4a442" path="/var/lib/kubelet/pods/a7ebf070-7e6d-4215-afc9-6f7c7df4a442/volumes" Mar 18 18:07:00 crc kubenswrapper[5008]: I0318 18:07:00.207344 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f135d3-9f42-4d08-bfd2-71aa61f5e01f" path="/var/lib/kubelet/pods/f4f135d3-9f42-4d08-bfd2-71aa61f5e01f/volumes" Mar 18 18:07:00 crc kubenswrapper[5008]: I0318 18:07:00.427532 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" 
event={"ID":"de4988b0-2070-45c2-be25-e64b8fe41965","Type":"ContainerStarted","Data":"2a54b194957d3a5636b9425b3abe9a52f184fb976ade3e561d2538915547ebfd"} Mar 18 18:07:00 crc kubenswrapper[5008]: I0318 18:07:00.430180 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" event={"ID":"2a1a035c-088e-454b-97cf-c5db6131aa2d","Type":"ContainerStarted","Data":"e8919674ba54bce1b6438be0a75a505f371456bd78ac1a683899395bf4f6d23e"} Mar 18 18:07:00 crc kubenswrapper[5008]: I0318 18:07:00.528500 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fln9j"] Mar 18 18:07:00 crc kubenswrapper[5008]: I0318 18:07:00.528935 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fln9j" podUID="d548a7a8-a808-45d3-91be-b0e9242383ec" containerName="registry-server" containerID="cri-o://f8c1537eda52ea5882193d2c7e5a77551b13a838fcfa0e680aa1d3eed9ed9656" gracePeriod=2 Mar 18 18:07:01 crc kubenswrapper[5008]: I0318 18:07:01.124830 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krh4d"] Mar 18 18:07:01 crc kubenswrapper[5008]: I0318 18:07:01.125309 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-krh4d" podUID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" containerName="registry-server" containerID="cri-o://b5a2e21b5861f8a51088588866da2565d9ca3c8e2ad7d1cccc868592d5f82006" gracePeriod=2 Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:01.438240 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" event={"ID":"2a1a035c-088e-454b-97cf-c5db6131aa2d","Type":"ContainerStarted","Data":"8dd2946974dc6646683582feae69235f88eadb74d23ac2c0f2f801d584fa86b4"} Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 
18:07:01.438575 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:01.439502 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" event={"ID":"de4988b0-2070-45c2-be25-e64b8fe41965","Type":"ContainerStarted","Data":"0f51e59d155f6c23cba548e1ae71e19633e8ada6dc30593a5e7c0d5952f1436b"} Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:01.439709 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:01.444329 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:01.444778 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:01.455764 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" podStartSLOduration=4.4557467 podStartE2EDuration="4.4557467s" podCreationTimestamp="2026-03-18 18:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:07:01.454434786 +0000 UTC m=+277.973907875" watchObservedRunningTime="2026-03-18 18:07:01.4557467 +0000 UTC m=+277.975219769" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.448097 5008 generic.go:334] "Generic (PLEG): container finished" podID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" containerID="b5a2e21b5861f8a51088588866da2565d9ca3c8e2ad7d1cccc868592d5f82006" exitCode=0 Mar 
18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.448166 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krh4d" event={"ID":"e03ee689-ed4f-4b64-9e4a-4d6febd71716","Type":"ContainerDied","Data":"b5a2e21b5861f8a51088588866da2565d9ca3c8e2ad7d1cccc868592d5f82006"} Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.452005 5008 generic.go:334] "Generic (PLEG): container finished" podID="d548a7a8-a808-45d3-91be-b0e9242383ec" containerID="f8c1537eda52ea5882193d2c7e5a77551b13a838fcfa0e680aa1d3eed9ed9656" exitCode=0 Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.452110 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fln9j" event={"ID":"d548a7a8-a808-45d3-91be-b0e9242383ec","Type":"ContainerDied","Data":"f8c1537eda52ea5882193d2c7e5a77551b13a838fcfa0e680aa1d3eed9ed9656"} Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.506953 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fln9j" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.524328 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" podStartSLOduration=5.524308084 podStartE2EDuration="5.524308084s" podCreationTimestamp="2026-03-18 18:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:07:01.496297174 +0000 UTC m=+278.015770263" watchObservedRunningTime="2026-03-18 18:07:02.524308084 +0000 UTC m=+279.043781173" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.672515 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d548a7a8-a808-45d3-91be-b0e9242383ec-catalog-content\") pod \"d548a7a8-a808-45d3-91be-b0e9242383ec\" (UID: \"d548a7a8-a808-45d3-91be-b0e9242383ec\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.672643 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj8sf\" (UniqueName: \"kubernetes.io/projected/d548a7a8-a808-45d3-91be-b0e9242383ec-kube-api-access-xj8sf\") pod \"d548a7a8-a808-45d3-91be-b0e9242383ec\" (UID: \"d548a7a8-a808-45d3-91be-b0e9242383ec\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.672796 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d548a7a8-a808-45d3-91be-b0e9242383ec-utilities\") pod \"d548a7a8-a808-45d3-91be-b0e9242383ec\" (UID: \"d548a7a8-a808-45d3-91be-b0e9242383ec\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.674254 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d548a7a8-a808-45d3-91be-b0e9242383ec-utilities" (OuterVolumeSpecName: "utilities") pod 
"d548a7a8-a808-45d3-91be-b0e9242383ec" (UID: "d548a7a8-a808-45d3-91be-b0e9242383ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.683037 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d548a7a8-a808-45d3-91be-b0e9242383ec-kube-api-access-xj8sf" (OuterVolumeSpecName: "kube-api-access-xj8sf") pod "d548a7a8-a808-45d3-91be-b0e9242383ec" (UID: "d548a7a8-a808-45d3-91be-b0e9242383ec"). InnerVolumeSpecName "kube-api-access-xj8sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.775978 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj8sf\" (UniqueName: \"kubernetes.io/projected/d548a7a8-a808-45d3-91be-b0e9242383ec-kube-api-access-xj8sf\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.776384 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d548a7a8-a808-45d3-91be-b0e9242383ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.782924 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d548a7a8-a808-45d3-91be-b0e9242383ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d548a7a8-a808-45d3-91be-b0e9242383ec" (UID: "d548a7a8-a808-45d3-91be-b0e9242383ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.877360 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d548a7a8-a808-45d3-91be-b0e9242383ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.925547 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dvkpk"] Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:02.925959 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dvkpk" podUID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" containerName="registry-server" containerID="cri-o://faf26a0a94a592e2827953bea96c2a03603c0000202d1c837eb7da6c595d65b3" gracePeriod=2 Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:03.464062 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fln9j" event={"ID":"d548a7a8-a808-45d3-91be-b0e9242383ec","Type":"ContainerDied","Data":"67a29946f2ec67e778e396ac154fff585ce8c10d144b941aaea87f29b903e5fb"} Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:03.464132 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fln9j" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:03.464167 5008 scope.go:117] "RemoveContainer" containerID="f8c1537eda52ea5882193d2c7e5a77551b13a838fcfa0e680aa1d3eed9ed9656" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:03.786366 5008 scope.go:117] "RemoveContainer" containerID="0030a92821f73c5fd1b21ea5fb458ea355df7bcc8a98688d1f9376b6858cee5f" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:03.789758 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fln9j"] Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:03.796141 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fln9j"] Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:03.815126 5008 scope.go:117] "RemoveContainer" containerID="36296ca70c048458db65ba5c18c43d2e22dd69b6a62e7e52a8060cb55e66d8c2" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.211114 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d548a7a8-a808-45d3-91be-b0e9242383ec" path="/var/lib/kubelet/pods/d548a7a8-a808-45d3-91be-b0e9242383ec/volumes" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.407936 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" podUID="1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" containerName="oauth-openshift" containerID="cri-o://935c0a1b327a15c6ea32f61ea5cfa918d9fa1acdee967cf9748e17924b8bd747" gracePeriod=15 Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.431764 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.473374 5008 generic.go:334] "Generic (PLEG): container finished" podID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" containerID="faf26a0a94a592e2827953bea96c2a03603c0000202d1c837eb7da6c595d65b3" exitCode=0 Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.473448 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvkpk" event={"ID":"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6","Type":"ContainerDied","Data":"faf26a0a94a592e2827953bea96c2a03603c0000202d1c837eb7da6c595d65b3"} Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.476006 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krh4d" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.476003 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krh4d" event={"ID":"e03ee689-ed4f-4b64-9e4a-4d6febd71716","Type":"ContainerDied","Data":"6b463eb8cfae6531761f5c8ae9a01acfe7205f06dcfc7d76474c7fb3772d3094"} Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.476166 5008 scope.go:117] "RemoveContainer" containerID="b5a2e21b5861f8a51088588866da2565d9ca3c8e2ad7d1cccc868592d5f82006" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.499766 5008 scope.go:117] "RemoveContainer" containerID="194e3d640c40bab5e588c8cf75e4e4e3a5e9299c84f6d0cf8aa3174472792acb" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.537650 5008 scope.go:117] "RemoveContainer" containerID="54db9b507be2946e47cd81a9088947508d9f7c261e6964921e8dba990e27b6fe" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.617399 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lp5w\" (UniqueName: \"kubernetes.io/projected/e03ee689-ed4f-4b64-9e4a-4d6febd71716-kube-api-access-4lp5w\") pod 
\"e03ee689-ed4f-4b64-9e4a-4d6febd71716\" (UID: \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.617542 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03ee689-ed4f-4b64-9e4a-4d6febd71716-utilities\") pod \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\" (UID: \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.617615 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03ee689-ed4f-4b64-9e4a-4d6febd71716-catalog-content\") pod \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\" (UID: \"e03ee689-ed4f-4b64-9e4a-4d6febd71716\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.618523 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03ee689-ed4f-4b64-9e4a-4d6febd71716-utilities" (OuterVolumeSpecName: "utilities") pod "e03ee689-ed4f-4b64-9e4a-4d6febd71716" (UID: "e03ee689-ed4f-4b64-9e4a-4d6febd71716"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.629853 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03ee689-ed4f-4b64-9e4a-4d6febd71716-kube-api-access-4lp5w" (OuterVolumeSpecName: "kube-api-access-4lp5w") pod "e03ee689-ed4f-4b64-9e4a-4d6febd71716" (UID: "e03ee689-ed4f-4b64-9e4a-4d6febd71716"). InnerVolumeSpecName "kube-api-access-4lp5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.639408 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.650224 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03ee689-ed4f-4b64-9e4a-4d6febd71716-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e03ee689-ed4f-4b64-9e4a-4d6febd71716" (UID: "e03ee689-ed4f-4b64-9e4a-4d6febd71716"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.719342 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03ee689-ed4f-4b64-9e4a-4d6febd71716-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.719384 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03ee689-ed4f-4b64-9e4a-4d6febd71716-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.719399 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lp5w\" (UniqueName: \"kubernetes.io/projected/e03ee689-ed4f-4b64-9e4a-4d6febd71716-kube-api-access-4lp5w\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.819691 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-utilities\") pod \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\" (UID: \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.820508 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-catalog-content\") pod \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\" (UID: 
\"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.820537 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj4hb\" (UniqueName: \"kubernetes.io/projected/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-kube-api-access-jj4hb\") pod \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\" (UID: \"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.820764 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.820944 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-utilities" (OuterVolumeSpecName: "utilities") pod "8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" (UID: "8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.831943 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-kube-api-access-jj4hb" (OuterVolumeSpecName: "kube-api-access-jj4hb") pod "8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" (UID: "8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6"). InnerVolumeSpecName "kube-api-access-jj4hb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.835819 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krh4d"] Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.839336 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-krh4d"] Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.920957 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-ocp-branding-template\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921349 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-audit-policies\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921379 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhz2l\" (UniqueName: \"kubernetes.io/projected/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-kube-api-access-fhz2l\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921402 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-service-ca\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921425 5008 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-trusted-ca-bundle\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921451 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-provider-selection\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921479 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-session\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921513 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-login\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921570 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-error\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921602 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-router-certs\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921624 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-serving-cert\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921649 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-audit-dir\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921673 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-cliconfig\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921695 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-idp-0-file-data\") pod \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\" (UID: \"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7\") " Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921884 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj4hb\" (UniqueName: 
\"kubernetes.io/projected/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-kube-api-access-jj4hb\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921901 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.921958 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.922014 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.922288 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.922545 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.923355 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.924367 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.929190 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-kube-api-access-fhz2l" (OuterVolumeSpecName: "kube-api-access-fhz2l") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "kube-api-access-fhz2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.933092 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.933302 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.933455 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.933967 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.934339 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.935299 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.941244 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" (UID: "1c2ce672-3aa9-45a2-ab2e-68c4c696bce7"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:07:04 crc kubenswrapper[5008]: I0318 18:07:04.983476 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" (UID: "8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022638 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhz2l\" (UniqueName: \"kubernetes.io/projected/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-kube-api-access-fhz2l\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022674 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022685 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022695 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022707 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022720 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022729 5008 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022738 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022749 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022757 5008 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022766 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022775 5008 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022785 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022794 5008 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.022803 5008 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.352971 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.490857 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvkpk" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.490866 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvkpk" event={"ID":"8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6","Type":"ContainerDied","Data":"6e0c1cf05197aed9ab817cc6eee791becd2bb8bde27ab4b09de7f73f3fad6d08"} Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.491167 5008 scope.go:117] "RemoveContainer" containerID="faf26a0a94a592e2827953bea96c2a03603c0000202d1c837eb7da6c595d65b3" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.500156 5008 generic.go:334] "Generic (PLEG): container finished" podID="1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" containerID="935c0a1b327a15c6ea32f61ea5cfa918d9fa1acdee967cf9748e17924b8bd747" exitCode=0 Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.500193 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" event={"ID":"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7","Type":"ContainerDied","Data":"935c0a1b327a15c6ea32f61ea5cfa918d9fa1acdee967cf9748e17924b8bd747"} Mar 18 18:07:05 crc 
kubenswrapper[5008]: I0318 18:07:05.500219 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" event={"ID":"1c2ce672-3aa9-45a2-ab2e-68c4c696bce7","Type":"ContainerDied","Data":"c4a1881be879c42a40aa76510b9acbe29c170e15f6e4368e590e2ad54410f9fe"} Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.500246 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-z9ssp" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.530905 5008 scope.go:117] "RemoveContainer" containerID="501fd843db8c064a0a0ed6640f8fbe108a19100b544d4f2fdae8eb931e22b264" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.558170 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dvkpk"] Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.565284 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dvkpk"] Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.568802 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-z9ssp"] Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.570660 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-z9ssp"] Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.578212 5008 scope.go:117] "RemoveContainer" containerID="8e9604c39970d34cca722cffe1a698fa864ecfdb63481c222b935e2449f77f39" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.601630 5008 scope.go:117] "RemoveContainer" containerID="935c0a1b327a15c6ea32f61ea5cfa918d9fa1acdee967cf9748e17924b8bd747" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.619133 5008 scope.go:117] "RemoveContainer" containerID="935c0a1b327a15c6ea32f61ea5cfa918d9fa1acdee967cf9748e17924b8bd747" Mar 18 18:07:05 crc kubenswrapper[5008]: E0318 
18:07:05.619442 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"935c0a1b327a15c6ea32f61ea5cfa918d9fa1acdee967cf9748e17924b8bd747\": container with ID starting with 935c0a1b327a15c6ea32f61ea5cfa918d9fa1acdee967cf9748e17924b8bd747 not found: ID does not exist" containerID="935c0a1b327a15c6ea32f61ea5cfa918d9fa1acdee967cf9748e17924b8bd747" Mar 18 18:07:05 crc kubenswrapper[5008]: I0318 18:07:05.619480 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"935c0a1b327a15c6ea32f61ea5cfa918d9fa1acdee967cf9748e17924b8bd747"} err="failed to get container status \"935c0a1b327a15c6ea32f61ea5cfa918d9fa1acdee967cf9748e17924b8bd747\": rpc error: code = NotFound desc = could not find container \"935c0a1b327a15c6ea32f61ea5cfa918d9fa1acdee967cf9748e17924b8bd747\": container with ID starting with 935c0a1b327a15c6ea32f61ea5cfa918d9fa1acdee967cf9748e17924b8bd747 not found: ID does not exist" Mar 18 18:07:06 crc kubenswrapper[5008]: I0318 18:07:06.214196 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" path="/var/lib/kubelet/pods/1c2ce672-3aa9-45a2-ab2e-68c4c696bce7/volumes" Mar 18 18:07:06 crc kubenswrapper[5008]: I0318 18:07:06.215330 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" path="/var/lib/kubelet/pods/8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6/volumes" Mar 18 18:07:06 crc kubenswrapper[5008]: I0318 18:07:06.216293 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" path="/var/lib/kubelet/pods/e03ee689-ed4f-4b64-9e4a-4d6febd71716/volumes" Mar 18 18:07:07 crc kubenswrapper[5008]: I0318 18:07:07.583435 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:07:07 crc kubenswrapper[5008]: I0318 
18:07:07.619224 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.765858 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-754dc54bdd-59z87"] Mar 18 18:07:08 crc kubenswrapper[5008]: E0318 18:07:08.766152 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" containerName="registry-server" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766172 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" containerName="registry-server" Mar 18 18:07:08 crc kubenswrapper[5008]: E0318 18:07:08.766189 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d548a7a8-a808-45d3-91be-b0e9242383ec" containerName="registry-server" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766200 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d548a7a8-a808-45d3-91be-b0e9242383ec" containerName="registry-server" Mar 18 18:07:08 crc kubenswrapper[5008]: E0318 18:07:08.766217 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" containerName="registry-server" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766228 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" containerName="registry-server" Mar 18 18:07:08 crc kubenswrapper[5008]: E0318 18:07:08.766250 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d548a7a8-a808-45d3-91be-b0e9242383ec" containerName="extract-utilities" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766261 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d548a7a8-a808-45d3-91be-b0e9242383ec" containerName="extract-utilities" Mar 18 18:07:08 crc kubenswrapper[5008]: E0318 18:07:08.766272 5008 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d548a7a8-a808-45d3-91be-b0e9242383ec" containerName="extract-content" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766284 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d548a7a8-a808-45d3-91be-b0e9242383ec" containerName="extract-content" Mar 18 18:07:08 crc kubenswrapper[5008]: E0318 18:07:08.766298 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" containerName="extract-content" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766308 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" containerName="extract-content" Mar 18 18:07:08 crc kubenswrapper[5008]: E0318 18:07:08.766322 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" containerName="extract-content" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766332 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" containerName="extract-content" Mar 18 18:07:08 crc kubenswrapper[5008]: E0318 18:07:08.766343 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" containerName="extract-utilities" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766354 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" containerName="extract-utilities" Mar 18 18:07:08 crc kubenswrapper[5008]: E0318 18:07:08.766369 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" containerName="extract-utilities" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766380 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" containerName="extract-utilities" Mar 18 18:07:08 crc kubenswrapper[5008]: E0318 18:07:08.766401 5008 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" containerName="oauth-openshift" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766411 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" containerName="oauth-openshift" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766605 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c2ce672-3aa9-45a2-ab2e-68c4c696bce7" containerName="oauth-openshift" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766624 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03ee689-ed4f-4b64-9e4a-4d6febd71716" containerName="registry-server" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766640 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="d548a7a8-a808-45d3-91be-b0e9242383ec" containerName="registry-server" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.766655 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f66c6d4-af17-4f00-a4f7-ee1a0ce0e7a6" containerName="registry-server" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.767262 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.770959 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.771881 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.771963 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.772264 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.772471 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.772785 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.773660 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.773664 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.774920 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.775125 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 18:07:08 crc 
kubenswrapper[5008]: I0318 18:07:08.775176 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.776056 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.776924 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0d0f293-a18a-4c40-b925-547ad91e8f69-audit-policies\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.777016 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.777082 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.777178 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-serving-cert\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.777242 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-service-ca\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.777261 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.777282 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-user-template-error\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.777361 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-user-template-login\") pod 
\"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.777457 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0d0f293-a18a-4c40-b925-547ad91e8f69-audit-dir\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.777501 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-session\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.777530 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-router-certs\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.777593 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc 
kubenswrapper[5008]: I0318 18:07:08.777629 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kkjh\" (UniqueName: \"kubernetes.io/projected/e0d0f293-a18a-4c40-b925-547ad91e8f69-kube-api-access-8kkjh\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.777725 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-cliconfig\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.782178 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.790653 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.791201 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-754dc54bdd-59z87"] Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.796350 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.879482 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-serving-cert\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: 
\"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.880146 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-service-ca\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.880302 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.880400 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-user-template-error\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.880515 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-user-template-login\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.880698 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0d0f293-a18a-4c40-b925-547ad91e8f69-audit-dir\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.880837 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-session\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.880942 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-router-certs\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.881044 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.881149 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kkjh\" (UniqueName: \"kubernetes.io/projected/e0d0f293-a18a-4c40-b925-547ad91e8f69-kube-api-access-8kkjh\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") 
" pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.881261 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-cliconfig\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.881388 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e0d0f293-a18a-4c40-b925-547ad91e8f69-audit-policies\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.881501 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.881620 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.883377 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/e0d0f293-a18a-4c40-b925-547ad91e8f69-audit-policies\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.883502 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-service-ca\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.883817 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-cliconfig\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.884334 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.884383 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e0d0f293-a18a-4c40-b925-547ad91e8f69-audit-dir\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 
18:07:08.888331 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-session\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.888913 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.889248 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-serving-cert\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.891874 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.892055 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-user-template-login\") pod 
\"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.891943 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-system-router-certs\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.892395 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-user-template-error\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.892631 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e0d0f293-a18a-4c40-b925-547ad91e8f69-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:08 crc kubenswrapper[5008]: I0318 18:07:08.920376 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kkjh\" (UniqueName: \"kubernetes.io/projected/e0d0f293-a18a-4c40-b925-547ad91e8f69-kube-api-access-8kkjh\") pod \"oauth-openshift-754dc54bdd-59z87\" (UID: \"e0d0f293-a18a-4c40-b925-547ad91e8f69\") " pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:09 crc kubenswrapper[5008]: I0318 18:07:09.093572 5008 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:09 crc kubenswrapper[5008]: I0318 18:07:09.556708 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-754dc54bdd-59z87"] Mar 18 18:07:09 crc kubenswrapper[5008]: W0318 18:07:09.574715 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0d0f293_a18a_4c40_b925_547ad91e8f69.slice/crio-d5f5975ba11065aa43946ebff4b1f91cf2466b082dbcfd03eec0b60d50c68bed WatchSource:0}: Error finding container d5f5975ba11065aa43946ebff4b1f91cf2466b082dbcfd03eec0b60d50c68bed: Status 404 returned error can't find the container with id d5f5975ba11065aa43946ebff4b1f91cf2466b082dbcfd03eec0b60d50c68bed Mar 18 18:07:10 crc kubenswrapper[5008]: I0318 18:07:10.535213 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" event={"ID":"e0d0f293-a18a-4c40-b925-547ad91e8f69","Type":"ContainerStarted","Data":"b0c4856fc6e149f46b1ced926ffd5079c061f8c996ad3ef15a66b70c5088614c"} Mar 18 18:07:10 crc kubenswrapper[5008]: I0318 18:07:10.535678 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" event={"ID":"e0d0f293-a18a-4c40-b925-547ad91e8f69","Type":"ContainerStarted","Data":"d5f5975ba11065aa43946ebff4b1f91cf2466b082dbcfd03eec0b60d50c68bed"} Mar 18 18:07:10 crc kubenswrapper[5008]: I0318 18:07:10.537245 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:10 crc kubenswrapper[5008]: I0318 18:07:10.566320 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" podStartSLOduration=31.566290183 podStartE2EDuration="31.566290183s" podCreationTimestamp="2026-03-18 18:06:39 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:07:10.554903995 +0000 UTC m=+287.074377114" watchObservedRunningTime="2026-03-18 18:07:10.566290183 +0000 UTC m=+287.085763302" Mar 18 18:07:10 crc kubenswrapper[5008]: I0318 18:07:10.856265 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-754dc54bdd-59z87" Mar 18 18:07:17 crc kubenswrapper[5008]: I0318 18:07:17.940457 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-755ccc5586-bsnx5"] Mar 18 18:07:17 crc kubenswrapper[5008]: I0318 18:07:17.942017 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" podUID="de4988b0-2070-45c2-be25-e64b8fe41965" containerName="controller-manager" containerID="cri-o://0f51e59d155f6c23cba548e1ae71e19633e8ada6dc30593a5e7c0d5952f1436b" gracePeriod=30 Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.030527 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg"] Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.030826 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" podUID="2a1a035c-088e-454b-97cf-c5db6131aa2d" containerName="route-controller-manager" containerID="cri-o://8dd2946974dc6646683582feae69235f88eadb74d23ac2c0f2f801d584fa86b4" gracePeriod=30 Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.533575 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.582074 5008 generic.go:334] "Generic (PLEG): container finished" podID="2a1a035c-088e-454b-97cf-c5db6131aa2d" containerID="8dd2946974dc6646683582feae69235f88eadb74d23ac2c0f2f801d584fa86b4" exitCode=0 Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.582140 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.582164 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" event={"ID":"2a1a035c-088e-454b-97cf-c5db6131aa2d","Type":"ContainerDied","Data":"8dd2946974dc6646683582feae69235f88eadb74d23ac2c0f2f801d584fa86b4"} Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.582672 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg" event={"ID":"2a1a035c-088e-454b-97cf-c5db6131aa2d","Type":"ContainerDied","Data":"e8919674ba54bce1b6438be0a75a505f371456bd78ac1a683899395bf4f6d23e"} Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.582751 5008 scope.go:117] "RemoveContainer" containerID="8dd2946974dc6646683582feae69235f88eadb74d23ac2c0f2f801d584fa86b4" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.585209 5008 generic.go:334] "Generic (PLEG): container finished" podID="de4988b0-2070-45c2-be25-e64b8fe41965" containerID="0f51e59d155f6c23cba548e1ae71e19633e8ada6dc30593a5e7c0d5952f1436b" exitCode=0 Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.585268 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" 
event={"ID":"de4988b0-2070-45c2-be25-e64b8fe41965","Type":"ContainerDied","Data":"0f51e59d155f6c23cba548e1ae71e19633e8ada6dc30593a5e7c0d5952f1436b"} Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.585312 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" event={"ID":"de4988b0-2070-45c2-be25-e64b8fe41965","Type":"ContainerDied","Data":"2a54b194957d3a5636b9425b3abe9a52f184fb976ade3e561d2538915547ebfd"} Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.585334 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a54b194957d3a5636b9425b3abe9a52f184fb976ade3e561d2538915547ebfd" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.589190 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.604463 5008 scope.go:117] "RemoveContainer" containerID="8dd2946974dc6646683582feae69235f88eadb74d23ac2c0f2f801d584fa86b4" Mar 18 18:07:18 crc kubenswrapper[5008]: E0318 18:07:18.605420 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dd2946974dc6646683582feae69235f88eadb74d23ac2c0f2f801d584fa86b4\": container with ID starting with 8dd2946974dc6646683582feae69235f88eadb74d23ac2c0f2f801d584fa86b4 not found: ID does not exist" containerID="8dd2946974dc6646683582feae69235f88eadb74d23ac2c0f2f801d584fa86b4" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.605468 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dd2946974dc6646683582feae69235f88eadb74d23ac2c0f2f801d584fa86b4"} err="failed to get container status \"8dd2946974dc6646683582feae69235f88eadb74d23ac2c0f2f801d584fa86b4\": rpc error: code = NotFound desc = could not find container 
\"8dd2946974dc6646683582feae69235f88eadb74d23ac2c0f2f801d584fa86b4\": container with ID starting with 8dd2946974dc6646683582feae69235f88eadb74d23ac2c0f2f801d584fa86b4 not found: ID does not exist" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.657132 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de4988b0-2070-45c2-be25-e64b8fe41965-serving-cert\") pod \"de4988b0-2070-45c2-be25-e64b8fe41965\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.657254 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqm52\" (UniqueName: \"kubernetes.io/projected/2a1a035c-088e-454b-97cf-c5db6131aa2d-kube-api-access-mqm52\") pod \"2a1a035c-088e-454b-97cf-c5db6131aa2d\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.657311 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a1a035c-088e-454b-97cf-c5db6131aa2d-client-ca\") pod \"2a1a035c-088e-454b-97cf-c5db6131aa2d\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.657364 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a1a035c-088e-454b-97cf-c5db6131aa2d-serving-cert\") pod \"2a1a035c-088e-454b-97cf-c5db6131aa2d\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.657434 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-proxy-ca-bundles\") pod \"de4988b0-2070-45c2-be25-e64b8fe41965\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " Mar 18 18:07:18 crc 
kubenswrapper[5008]: I0318 18:07:18.657507 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r88kd\" (UniqueName: \"kubernetes.io/projected/de4988b0-2070-45c2-be25-e64b8fe41965-kube-api-access-r88kd\") pod \"de4988b0-2070-45c2-be25-e64b8fe41965\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.657551 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a1a035c-088e-454b-97cf-c5db6131aa2d-config\") pod \"2a1a035c-088e-454b-97cf-c5db6131aa2d\" (UID: \"2a1a035c-088e-454b-97cf-c5db6131aa2d\") " Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.658881 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a1a035c-088e-454b-97cf-c5db6131aa2d-config" (OuterVolumeSpecName: "config") pod "2a1a035c-088e-454b-97cf-c5db6131aa2d" (UID: "2a1a035c-088e-454b-97cf-c5db6131aa2d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.660126 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-client-ca\") pod \"de4988b0-2070-45c2-be25-e64b8fe41965\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.660194 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-config\") pod \"de4988b0-2070-45c2-be25-e64b8fe41965\" (UID: \"de4988b0-2070-45c2-be25-e64b8fe41965\") " Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.660866 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "de4988b0-2070-45c2-be25-e64b8fe41965" (UID: "de4988b0-2070-45c2-be25-e64b8fe41965"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.660959 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a1a035c-088e-454b-97cf-c5db6131aa2d-client-ca" (OuterVolumeSpecName: "client-ca") pod "2a1a035c-088e-454b-97cf-c5db6131aa2d" (UID: "2a1a035c-088e-454b-97cf-c5db6131aa2d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.661143 5008 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.661234 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a1a035c-088e-454b-97cf-c5db6131aa2d-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.661495 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-client-ca" (OuterVolumeSpecName: "client-ca") pod "de4988b0-2070-45c2-be25-e64b8fe41965" (UID: "de4988b0-2070-45c2-be25-e64b8fe41965"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.661536 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-config" (OuterVolumeSpecName: "config") pod "de4988b0-2070-45c2-be25-e64b8fe41965" (UID: "de4988b0-2070-45c2-be25-e64b8fe41965"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.665591 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1a035c-088e-454b-97cf-c5db6131aa2d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2a1a035c-088e-454b-97cf-c5db6131aa2d" (UID: "2a1a035c-088e-454b-97cf-c5db6131aa2d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.665593 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de4988b0-2070-45c2-be25-e64b8fe41965-kube-api-access-r88kd" (OuterVolumeSpecName: "kube-api-access-r88kd") pod "de4988b0-2070-45c2-be25-e64b8fe41965" (UID: "de4988b0-2070-45c2-be25-e64b8fe41965"). InnerVolumeSpecName "kube-api-access-r88kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.667330 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4988b0-2070-45c2-be25-e64b8fe41965-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "de4988b0-2070-45c2-be25-e64b8fe41965" (UID: "de4988b0-2070-45c2-be25-e64b8fe41965"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.667339 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1a035c-088e-454b-97cf-c5db6131aa2d-kube-api-access-mqm52" (OuterVolumeSpecName: "kube-api-access-mqm52") pod "2a1a035c-088e-454b-97cf-c5db6131aa2d" (UID: "2a1a035c-088e-454b-97cf-c5db6131aa2d"). InnerVolumeSpecName "kube-api-access-mqm52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.762080 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r88kd\" (UniqueName: \"kubernetes.io/projected/de4988b0-2070-45c2-be25-e64b8fe41965-kube-api-access-r88kd\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.762114 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.762129 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de4988b0-2070-45c2-be25-e64b8fe41965-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.762140 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de4988b0-2070-45c2-be25-e64b8fe41965-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.762151 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqm52\" (UniqueName: \"kubernetes.io/projected/2a1a035c-088e-454b-97cf-c5db6131aa2d-kube-api-access-mqm52\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.762164 5008 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a1a035c-088e-454b-97cf-c5db6131aa2d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.762175 5008 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a1a035c-088e-454b-97cf-c5db6131aa2d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.917796 5008 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg"] Mar 18 18:07:18 crc kubenswrapper[5008]: I0318 18:07:18.923874 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbb75d756-z6xlg"] Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.596903 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-755ccc5586-bsnx5" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.645888 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-755ccc5586-bsnx5"] Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.654628 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-755ccc5586-bsnx5"] Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.782040 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc"] Mar 18 18:07:19 crc kubenswrapper[5008]: E0318 18:07:19.782459 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de4988b0-2070-45c2-be25-e64b8fe41965" containerName="controller-manager" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.782484 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4988b0-2070-45c2-be25-e64b8fe41965" containerName="controller-manager" Mar 18 18:07:19 crc kubenswrapper[5008]: E0318 18:07:19.782517 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1a035c-088e-454b-97cf-c5db6131aa2d" containerName="route-controller-manager" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.782531 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1a035c-088e-454b-97cf-c5db6131aa2d" containerName="route-controller-manager" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.782739 5008 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1a035c-088e-454b-97cf-c5db6131aa2d" containerName="route-controller-manager" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.782769 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="de4988b0-2070-45c2-be25-e64b8fe41965" containerName="controller-manager" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.783391 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.785717 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q"] Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.786422 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.786491 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.786649 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.787316 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.787396 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.787324 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.787899 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.793381 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.796408 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.796903 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.796933 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.797174 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.797454 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 18:07:19 crc 
kubenswrapper[5008]: I0318 18:07:19.798356 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc"] Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.800945 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q"] Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.803709 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.878163 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88af367-2e33-43c5-8b3a-a9c3e31c621e-config\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.878272 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88af367-2e33-43c5-8b3a-a9c3e31c621e-serving-cert\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.878325 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74bd978f-541c-4f11-a7b4-042cce753f6d-client-ca\") pod \"route-controller-manager-68b5b9cbd4-zd5bc\" (UID: \"74bd978f-541c-4f11-a7b4-042cce753f6d\") " pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.878381 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bd978f-541c-4f11-a7b4-042cce753f6d-config\") pod \"route-controller-manager-68b5b9cbd4-zd5bc\" (UID: \"74bd978f-541c-4f11-a7b4-042cce753f6d\") " pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.878416 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b88af367-2e33-43c5-8b3a-a9c3e31c621e-client-ca\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.878483 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5wc2\" (UniqueName: \"kubernetes.io/projected/b88af367-2e33-43c5-8b3a-a9c3e31c621e-kube-api-access-z5wc2\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.878508 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-949gf\" (UniqueName: \"kubernetes.io/projected/74bd978f-541c-4f11-a7b4-042cce753f6d-kube-api-access-949gf\") pod \"route-controller-manager-68b5b9cbd4-zd5bc\" (UID: \"74bd978f-541c-4f11-a7b4-042cce753f6d\") " pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.878566 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b88af367-2e33-43c5-8b3a-a9c3e31c621e-proxy-ca-bundles\") pod 
\"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.878596 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74bd978f-541c-4f11-a7b4-042cce753f6d-serving-cert\") pod \"route-controller-manager-68b5b9cbd4-zd5bc\" (UID: \"74bd978f-541c-4f11-a7b4-042cce753f6d\") " pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.979715 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88af367-2e33-43c5-8b3a-a9c3e31c621e-config\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.980001 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88af367-2e33-43c5-8b3a-a9c3e31c621e-serving-cert\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.980111 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74bd978f-541c-4f11-a7b4-042cce753f6d-client-ca\") pod \"route-controller-manager-68b5b9cbd4-zd5bc\" (UID: \"74bd978f-541c-4f11-a7b4-042cce753f6d\") " pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.980235 5008 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bd978f-541c-4f11-a7b4-042cce753f6d-config\") pod \"route-controller-manager-68b5b9cbd4-zd5bc\" (UID: \"74bd978f-541c-4f11-a7b4-042cce753f6d\") " pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.980355 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b88af367-2e33-43c5-8b3a-a9c3e31c621e-client-ca\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.980519 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5wc2\" (UniqueName: \"kubernetes.io/projected/b88af367-2e33-43c5-8b3a-a9c3e31c621e-kube-api-access-z5wc2\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.980668 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-949gf\" (UniqueName: \"kubernetes.io/projected/74bd978f-541c-4f11-a7b4-042cce753f6d-kube-api-access-949gf\") pod \"route-controller-manager-68b5b9cbd4-zd5bc\" (UID: \"74bd978f-541c-4f11-a7b4-042cce753f6d\") " pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.980789 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b88af367-2e33-43c5-8b3a-a9c3e31c621e-proxy-ca-bundles\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " 
pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.980903 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74bd978f-541c-4f11-a7b4-042cce753f6d-serving-cert\") pod \"route-controller-manager-68b5b9cbd4-zd5bc\" (UID: \"74bd978f-541c-4f11-a7b4-042cce753f6d\") " pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.981086 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88af367-2e33-43c5-8b3a-a9c3e31c621e-config\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.981219 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74bd978f-541c-4f11-a7b4-042cce753f6d-client-ca\") pod \"route-controller-manager-68b5b9cbd4-zd5bc\" (UID: \"74bd978f-541c-4f11-a7b4-042cce753f6d\") " pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.981251 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b88af367-2e33-43c5-8b3a-a9c3e31c621e-client-ca\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.981906 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bd978f-541c-4f11-a7b4-042cce753f6d-config\") pod 
\"route-controller-manager-68b5b9cbd4-zd5bc\" (UID: \"74bd978f-541c-4f11-a7b4-042cce753f6d\") " pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.982449 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b88af367-2e33-43c5-8b3a-a9c3e31c621e-proxy-ca-bundles\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.984722 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88af367-2e33-43c5-8b3a-a9c3e31c621e-serving-cert\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:19 crc kubenswrapper[5008]: I0318 18:07:19.991352 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74bd978f-541c-4f11-a7b4-042cce753f6d-serving-cert\") pod \"route-controller-manager-68b5b9cbd4-zd5bc\" (UID: \"74bd978f-541c-4f11-a7b4-042cce753f6d\") " pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:20 crc kubenswrapper[5008]: I0318 18:07:20.007719 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-949gf\" (UniqueName: \"kubernetes.io/projected/74bd978f-541c-4f11-a7b4-042cce753f6d-kube-api-access-949gf\") pod \"route-controller-manager-68b5b9cbd4-zd5bc\" (UID: \"74bd978f-541c-4f11-a7b4-042cce753f6d\") " pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:20 crc kubenswrapper[5008]: I0318 18:07:20.015348 5008 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-z5wc2\" (UniqueName: \"kubernetes.io/projected/b88af367-2e33-43c5-8b3a-a9c3e31c621e-kube-api-access-z5wc2\") pod \"controller-manager-7c8f5b88b7-hhg9q\" (UID: \"b88af367-2e33-43c5-8b3a-a9c3e31c621e\") " pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:20 crc kubenswrapper[5008]: I0318 18:07:20.108880 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:20 crc kubenswrapper[5008]: I0318 18:07:20.121233 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:20 crc kubenswrapper[5008]: I0318 18:07:20.206284 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a1a035c-088e-454b-97cf-c5db6131aa2d" path="/var/lib/kubelet/pods/2a1a035c-088e-454b-97cf-c5db6131aa2d/volumes" Mar 18 18:07:20 crc kubenswrapper[5008]: I0318 18:07:20.208289 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de4988b0-2070-45c2-be25-e64b8fe41965" path="/var/lib/kubelet/pods/de4988b0-2070-45c2-be25-e64b8fe41965/volumes" Mar 18 18:07:20 crc kubenswrapper[5008]: I0318 18:07:20.373657 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc"] Mar 18 18:07:20 crc kubenswrapper[5008]: W0318 18:07:20.385152 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74bd978f_541c_4f11_a7b4_042cce753f6d.slice/crio-c9560e844a397192c0dc3f56e58fa86b578d77992f36844ec031802ef9fe9a79 WatchSource:0}: Error finding container c9560e844a397192c0dc3f56e58fa86b578d77992f36844ec031802ef9fe9a79: Status 404 returned error can't find the container with id c9560e844a397192c0dc3f56e58fa86b578d77992f36844ec031802ef9fe9a79 Mar 18 18:07:20 crc 
kubenswrapper[5008]: I0318 18:07:20.550340 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q"] Mar 18 18:07:20 crc kubenswrapper[5008]: W0318 18:07:20.559140 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb88af367_2e33_43c5_8b3a_a9c3e31c621e.slice/crio-ad3953a287649a45cb885cce568f618dcf71cd0fdcdf2c4844b06880367ac3bd WatchSource:0}: Error finding container ad3953a287649a45cb885cce568f618dcf71cd0fdcdf2c4844b06880367ac3bd: Status 404 returned error can't find the container with id ad3953a287649a45cb885cce568f618dcf71cd0fdcdf2c4844b06880367ac3bd Mar 18 18:07:20 crc kubenswrapper[5008]: I0318 18:07:20.604364 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" event={"ID":"74bd978f-541c-4f11-a7b4-042cce753f6d","Type":"ContainerStarted","Data":"3e2de9d3c0c5f428e09559d77773dfe7204e6557884c47b3ef79b68967442964"} Mar 18 18:07:20 crc kubenswrapper[5008]: I0318 18:07:20.604432 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" event={"ID":"74bd978f-541c-4f11-a7b4-042cce753f6d","Type":"ContainerStarted","Data":"c9560e844a397192c0dc3f56e58fa86b578d77992f36844ec031802ef9fe9a79"} Mar 18 18:07:20 crc kubenswrapper[5008]: I0318 18:07:20.606678 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:20 crc kubenswrapper[5008]: I0318 18:07:20.608127 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" event={"ID":"b88af367-2e33-43c5-8b3a-a9c3e31c621e","Type":"ContainerStarted","Data":"ad3953a287649a45cb885cce568f618dcf71cd0fdcdf2c4844b06880367ac3bd"} Mar 18 18:07:20 crc 
kubenswrapper[5008]: I0318 18:07:20.624863 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" podStartSLOduration=2.624840898 podStartE2EDuration="2.624840898s" podCreationTimestamp="2026-03-18 18:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:07:20.622601949 +0000 UTC m=+297.142075018" watchObservedRunningTime="2026-03-18 18:07:20.624840898 +0000 UTC m=+297.144313997" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.025336 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-68b5b9cbd4-zd5bc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.617985 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" event={"ID":"b88af367-2e33-43c5-8b3a-a9c3e31c621e","Type":"ContainerStarted","Data":"f8d3e9366bb7efa704575088af7fc3668a19f27b6e5e7a9ec413227e7b530230"} Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.618902 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.624116 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.638762 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c8f5b88b7-hhg9q" podStartSLOduration=4.638734336 podStartE2EDuration="4.638734336s" podCreationTimestamp="2026-03-18 18:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 18:07:21.634785843 +0000 UTC m=+298.154258932" watchObservedRunningTime="2026-03-18 18:07:21.638734336 +0000 UTC m=+298.158207435" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.673284 5008 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.674128 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.716627 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.746427 5008 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.746762 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5" gracePeriod=15 Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.746912 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237" gracePeriod=15 Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.746964 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" 
containerID="cri-o://4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04" gracePeriod=15 Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.747022 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7" gracePeriod=15 Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.747061 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414" gracePeriod=15 Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749021 5008 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 18:07:21 crc kubenswrapper[5008]: E0318 18:07:21.749264 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749281 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 18:07:21 crc kubenswrapper[5008]: E0318 18:07:21.749297 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749303 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 18:07:21 crc kubenswrapper[5008]: E0318 18:07:21.749312 5008 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749318 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: E0318 18:07:21.749325 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749330 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: E0318 18:07:21.749344 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749351 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 18:07:21 crc kubenswrapper[5008]: E0318 18:07:21.749366 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749372 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 18:07:21 crc kubenswrapper[5008]: E0318 18:07:21.749384 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749391 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 18:07:21 crc kubenswrapper[5008]: E0318 18:07:21.749401 5008 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749408 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749514 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749526 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749536 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749544 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749575 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749584 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749592 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749599 5008 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 18:07:21 crc kubenswrapper[5008]: E0318 18:07:21.749698 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749709 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: E0318 18:07:21.749720 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749727 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.749838 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.804188 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.804275 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc 
kubenswrapper[5008]: I0318 18:07:21.804333 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.804491 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.804535 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.906063 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.906104 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.906134 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.906190 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.906211 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.906237 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.906252 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.906289 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.906316 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.906204 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.906366 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.906401 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:21 crc kubenswrapper[5008]: I0318 18:07:21.906439 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.008129 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.008235 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.008252 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.008342 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.008434 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.008527 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.011701 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:07:22 crc kubenswrapper[5008]: W0318 18:07:22.035816 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6deddaf1893b234b7a7fb7a985c62f69cd19e729920ddeaf85bd72066a173ebe WatchSource:0}: Error finding container 6deddaf1893b234b7a7fb7a985c62f69cd19e729920ddeaf85bd72066a173ebe: Status 404 returned error can't find the container with id 6deddaf1893b234b7a7fb7a985c62f69cd19e729920ddeaf85bd72066a173ebe Mar 18 18:07:22 crc kubenswrapper[5008]: E0318 18:07:22.039135 5008 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e01c369eebd61 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:07:22.038058337 +0000 UTC m=+298.557531456,LastTimestamp:2026-03-18 18:07:22.038058337 +0000 UTC m=+298.557531456,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.634059 5008 generic.go:334] "Generic (PLEG): container finished" podID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" containerID="4445bddcdf5ab64c32dc31c58508ec659b76b545703ad1b1058e8322c481ebc5" exitCode=0 Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.634141 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"78c1c7b1-ecc0-4966-9be8-536fcd15335e","Type":"ContainerDied","Data":"4445bddcdf5ab64c32dc31c58508ec659b76b545703ad1b1058e8322c481ebc5"} Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.634938 5008 status_manager.go:851] "Failed to get status for pod" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.635369 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.637586 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.639174 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.640843 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237" exitCode=0 Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.640881 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04" exitCode=0 Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.640895 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7" exitCode=0 Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.640907 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414" exitCode=2 Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.640906 5008 scope.go:117] "RemoveContainer" containerID="1c455b5c293355a2f7acbb17bde2d8584ee614b5dabf750e716aa2e180131960" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.643351 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"590b1953740af554f9dab8fa040bf5f70a5915f88385c0a79d447f29fe175abf"} Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.643409 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6deddaf1893b234b7a7fb7a985c62f69cd19e729920ddeaf85bd72066a173ebe"} Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.643911 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:22 crc kubenswrapper[5008]: I0318 18:07:22.644303 5008 status_manager.go:851] "Failed to get status for pod" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:23 crc kubenswrapper[5008]: I0318 18:07:23.657028 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 18:07:23 crc kubenswrapper[5008]: I0318 18:07:23.999077 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.000016 5008 status_manager.go:851] "Failed to get status for pod" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.000506 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.119458 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.120221 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.120729 5008 status_manager.go:851] "Failed to get status for pod" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.121035 5008 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.121403 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.154167 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78c1c7b1-ecc0-4966-9be8-536fcd15335e-kubelet-dir\") pod \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\" (UID: \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\") " Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.154238 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78c1c7b1-ecc0-4966-9be8-536fcd15335e-var-lock\") pod \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\" (UID: \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\") " Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 
18:07:24.154266 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78c1c7b1-ecc0-4966-9be8-536fcd15335e-kube-api-access\") pod \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\" (UID: \"78c1c7b1-ecc0-4966-9be8-536fcd15335e\") " Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.154293 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78c1c7b1-ecc0-4966-9be8-536fcd15335e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "78c1c7b1-ecc0-4966-9be8-536fcd15335e" (UID: "78c1c7b1-ecc0-4966-9be8-536fcd15335e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.154311 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78c1c7b1-ecc0-4966-9be8-536fcd15335e-var-lock" (OuterVolumeSpecName: "var-lock") pod "78c1c7b1-ecc0-4966-9be8-536fcd15335e" (UID: "78c1c7b1-ecc0-4966-9be8-536fcd15335e"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.154518 5008 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78c1c7b1-ecc0-4966-9be8-536fcd15335e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.154531 5008 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78c1c7b1-ecc0-4966-9be8-536fcd15335e-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.159887 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c1c7b1-ecc0-4966-9be8-536fcd15335e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "78c1c7b1-ecc0-4966-9be8-536fcd15335e" (UID: "78c1c7b1-ecc0-4966-9be8-536fcd15335e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.203310 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.203857 5008 status_manager.go:851] "Failed to get status for pod" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.204345 5008 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.255152 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.255193 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.255245 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.255294 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.255322 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.255434 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.255547 5008 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.255576 5008 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.255585 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78c1c7b1-ecc0-4966-9be8-536fcd15335e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.255596 5008 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.667757 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.672254 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.672674 5008 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5" exitCode=0 Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.672782 5008 scope.go:117] "RemoveContainer" containerID="ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.673365 5008 status_manager.go:851] "Failed to get status for pod" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.673773 5008 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.674529 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.676273 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"78c1c7b1-ecc0-4966-9be8-536fcd15335e","Type":"ContainerDied","Data":"3ff185ab969673eaa33bc63ea1cb609a3148a80e0b3759b680e41f82d1fa19f3"} Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.676307 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ff185ab969673eaa33bc63ea1cb609a3148a80e0b3759b680e41f82d1fa19f3" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.676509 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.681885 5008 status_manager.go:851] "Failed to get status for pod" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.682356 5008 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.682860 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.703176 5008 scope.go:117] "RemoveContainer" containerID="4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.704788 5008 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.705639 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.706230 5008 status_manager.go:851] "Failed to get status for pod" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.723968 5008 scope.go:117] "RemoveContainer" containerID="7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.746060 5008 scope.go:117] "RemoveContainer" containerID="c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.769265 5008 scope.go:117] "RemoveContainer" containerID="1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.791908 5008 scope.go:117] "RemoveContainer" containerID="874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.815817 5008 scope.go:117] "RemoveContainer" 
containerID="ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237" Mar 18 18:07:24 crc kubenswrapper[5008]: E0318 18:07:24.816449 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\": container with ID starting with ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237 not found: ID does not exist" containerID="ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.816593 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237"} err="failed to get container status \"ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\": rpc error: code = NotFound desc = could not find container \"ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237\": container with ID starting with ffdec92ecfcb38eb494fd283945631e8cd9378901d1f73a916e0d61e04f92237 not found: ID does not exist" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.816648 5008 scope.go:117] "RemoveContainer" containerID="4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04" Mar 18 18:07:24 crc kubenswrapper[5008]: E0318 18:07:24.817423 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\": container with ID starting with 4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04 not found: ID does not exist" containerID="4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.817620 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04"} err="failed to get container status \"4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\": rpc error: code = NotFound desc = could not find container \"4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04\": container with ID starting with 4f0b5d81749e80f4742006e1281258b09448b7ede8b385760f5babbb177b5c04 not found: ID does not exist" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.817723 5008 scope.go:117] "RemoveContainer" containerID="7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7" Mar 18 18:07:24 crc kubenswrapper[5008]: E0318 18:07:24.818196 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\": container with ID starting with 7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7 not found: ID does not exist" containerID="7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.818451 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7"} err="failed to get container status \"7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\": rpc error: code = NotFound desc = could not find container \"7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7\": container with ID starting with 7dd5662d3369839bc392b06e48b61f20220586d7bc26f6c2f5959aee95e27db7 not found: ID does not exist" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.818542 5008 scope.go:117] "RemoveContainer" containerID="c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414" Mar 18 18:07:24 crc kubenswrapper[5008]: E0318 18:07:24.819006 5008 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\": container with ID starting with c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414 not found: ID does not exist" containerID="c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.819049 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414"} err="failed to get container status \"c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\": rpc error: code = NotFound desc = could not find container \"c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414\": container with ID starting with c0452bc7d2f117175d304cb2944cfc9a3e5721deed90e3a6b7e90b4954617414 not found: ID does not exist" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.819078 5008 scope.go:117] "RemoveContainer" containerID="1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5" Mar 18 18:07:24 crc kubenswrapper[5008]: E0318 18:07:24.819474 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\": container with ID starting with 1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5 not found: ID does not exist" containerID="1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.819516 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5"} err="failed to get container status \"1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\": rpc error: code = NotFound desc = could not find container 
\"1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5\": container with ID starting with 1808ad861f8a25b40dc02ccfc09d7b936e2b6baf0b1f6965419e49f1cf26b8c5 not found: ID does not exist" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.819549 5008 scope.go:117] "RemoveContainer" containerID="874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82" Mar 18 18:07:24 crc kubenswrapper[5008]: E0318 18:07:24.819933 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\": container with ID starting with 874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82 not found: ID does not exist" containerID="874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82" Mar 18 18:07:24 crc kubenswrapper[5008]: I0318 18:07:24.819955 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82"} err="failed to get container status \"874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\": rpc error: code = NotFound desc = could not find container \"874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82\": container with ID starting with 874ed3e0c05601292003d59158122a9d7988cefbd8f074322649009ff5e08d82 not found: ID does not exist" Mar 18 18:07:26 crc kubenswrapper[5008]: I0318 18:07:26.212583 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 18 18:07:28 crc kubenswrapper[5008]: E0318 18:07:28.339434 5008 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:28 crc 
kubenswrapper[5008]: E0318 18:07:28.340385 5008 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:28 crc kubenswrapper[5008]: E0318 18:07:28.340953 5008 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:28 crc kubenswrapper[5008]: E0318 18:07:28.341587 5008 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:28 crc kubenswrapper[5008]: E0318 18:07:28.342156 5008 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:28 crc kubenswrapper[5008]: I0318 18:07:28.342216 5008 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 18 18:07:28 crc kubenswrapper[5008]: E0318 18:07:28.342690 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="200ms" Mar 18 18:07:28 crc kubenswrapper[5008]: E0318 18:07:28.447448 5008 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.9:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e01c369eebd61 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 18:07:22.038058337 +0000 UTC m=+298.557531456,LastTimestamp:2026-03-18 18:07:22.038058337 +0000 UTC m=+298.557531456,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 18:07:28 crc kubenswrapper[5008]: E0318 18:07:28.543814 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="400ms" Mar 18 18:07:28 crc kubenswrapper[5008]: E0318 18:07:28.944914 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="800ms" Mar 18 18:07:29 crc kubenswrapper[5008]: E0318 18:07:29.746545 5008 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="1.6s" Mar 18 18:07:31 crc kubenswrapper[5008]: E0318 18:07:31.347908 5008 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="3.2s" Mar 18 18:07:33 crc kubenswrapper[5008]: I0318 18:07:33.198396 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:33 crc kubenswrapper[5008]: I0318 18:07:33.200042 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:33 crc kubenswrapper[5008]: I0318 18:07:33.200593 5008 status_manager.go:851] "Failed to get status for pod" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:33 crc kubenswrapper[5008]: I0318 18:07:33.224737 5008 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e76c31bc-28af-4476-8a03-e9250a873fa6" Mar 18 18:07:33 crc kubenswrapper[5008]: I0318 18:07:33.224797 5008 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e76c31bc-28af-4476-8a03-e9250a873fa6" Mar 18 18:07:33 crc kubenswrapper[5008]: E0318 18:07:33.225459 5008 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 
18:07:33 crc kubenswrapper[5008]: I0318 18:07:33.226299 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:33 crc kubenswrapper[5008]: I0318 18:07:33.758620 5008 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="f9a136ff588c627c875b5b46d7ce11382f4780317f87edfb532721bc3b36f71e" exitCode=0 Mar 18 18:07:33 crc kubenswrapper[5008]: I0318 18:07:33.758713 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"f9a136ff588c627c875b5b46d7ce11382f4780317f87edfb532721bc3b36f71e"} Mar 18 18:07:33 crc kubenswrapper[5008]: I0318 18:07:33.759430 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2d24d634913ac62f8b8be757b0aec296ee007b9b87824e0534f283064f2697af"} Mar 18 18:07:33 crc kubenswrapper[5008]: I0318 18:07:33.759938 5008 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e76c31bc-28af-4476-8a03-e9250a873fa6" Mar 18 18:07:33 crc kubenswrapper[5008]: I0318 18:07:33.759981 5008 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e76c31bc-28af-4476-8a03-e9250a873fa6" Mar 18 18:07:33 crc kubenswrapper[5008]: I0318 18:07:33.760369 5008 status_manager.go:851] "Failed to get status for pod" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:33 crc kubenswrapper[5008]: I0318 18:07:33.761104 5008 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:33 crc kubenswrapper[5008]: E0318 18:07:33.761075 5008 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:34 crc kubenswrapper[5008]: I0318 18:07:34.210851 5008 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:34 crc kubenswrapper[5008]: I0318 18:07:34.211401 5008 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:34 crc kubenswrapper[5008]: I0318 18:07:34.211803 5008 status_manager.go:851] "Failed to get status for pod" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Mar 18 18:07:34 crc kubenswrapper[5008]: I0318 18:07:34.767985 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"67c7a5b8efd39e9f159d2c145def59263001333813da450d12749dbc7d3f3839"} Mar 18 18:07:34 crc kubenswrapper[5008]: I0318 18:07:34.768023 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e984682907012012d74ed129a7b54f530a2918dcadf276d9f9789a354da4a268"} Mar 18 18:07:34 crc kubenswrapper[5008]: I0318 18:07:34.768031 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"31d423c6ab241c02ab9f2cf7f27530287cd43c5cbc8b4283f8a4003ff439f1fe"} Mar 18 18:07:35 crc kubenswrapper[5008]: I0318 18:07:35.776698 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"407cd97603232d67f01fe1699e4a660559168192027cce64942fe0e18dbf5cfc"} Mar 18 18:07:35 crc kubenswrapper[5008]: I0318 18:07:35.777073 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"27a2c7851134c0a4db02b6a95f4ea55accbf79ce1eac6312abc411e57e27f4ec"} Mar 18 18:07:35 crc kubenswrapper[5008]: I0318 18:07:35.777096 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:35 crc kubenswrapper[5008]: I0318 18:07:35.776992 5008 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e76c31bc-28af-4476-8a03-e9250a873fa6" Mar 18 18:07:35 crc kubenswrapper[5008]: I0318 18:07:35.777120 5008 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="e76c31bc-28af-4476-8a03-e9250a873fa6" Mar 18 18:07:36 crc kubenswrapper[5008]: I0318 18:07:36.784184 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 18:07:36 crc kubenswrapper[5008]: I0318 18:07:36.784810 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 18:07:36 crc kubenswrapper[5008]: I0318 18:07:36.784869 5008 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f11fed99b3e0c3592033b1e88ef8e6316eaeb569687a3141c3ed629fe1ba64ec" exitCode=1 Mar 18 18:07:36 crc kubenswrapper[5008]: I0318 18:07:36.784902 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f11fed99b3e0c3592033b1e88ef8e6316eaeb569687a3141c3ed629fe1ba64ec"} Mar 18 18:07:36 crc kubenswrapper[5008]: I0318 18:07:36.785456 5008 scope.go:117] "RemoveContainer" containerID="f11fed99b3e0c3592033b1e88ef8e6316eaeb569687a3141c3ed629fe1ba64ec" Mar 18 18:07:36 crc kubenswrapper[5008]: I0318 18:07:36.909842 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:07:37 crc kubenswrapper[5008]: I0318 18:07:37.792690 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 18:07:37 crc kubenswrapper[5008]: I0318 18:07:37.793414 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 18:07:37 crc kubenswrapper[5008]: I0318 18:07:37.793463 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7a7d6e8d8e594b65f3eac799d33b7223e2b97a177d075814ad4f3f8308a394a3"} Mar 18 18:07:38 crc kubenswrapper[5008]: I0318 18:07:38.226687 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:38 crc kubenswrapper[5008]: I0318 18:07:38.226734 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:38 crc kubenswrapper[5008]: I0318 18:07:38.232518 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:40 crc kubenswrapper[5008]: I0318 18:07:40.786567 5008 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:40 crc kubenswrapper[5008]: I0318 18:07:40.810803 5008 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e76c31bc-28af-4476-8a03-e9250a873fa6" Mar 18 18:07:40 crc kubenswrapper[5008]: I0318 18:07:40.810855 5008 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e76c31bc-28af-4476-8a03-e9250a873fa6" Mar 18 18:07:40 crc kubenswrapper[5008]: I0318 18:07:40.815992 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:40 crc kubenswrapper[5008]: I0318 18:07:40.825637 5008 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76c31bc-28af-4476-8a03-e9250a873fa6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:07:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:07:33Z\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T18:07:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d423c6ab241c02ab9f2cf7f27530287cd43c5cbc8b4283f8a4003ff439f1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://67c7a5b8efd39e9f159d2c145def59263001333813da450d12749dbc7d3f3839\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e984682907012012d74ed129a7b54f530a2918dcadf276d9f9789a354da4a268\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://407cd97603232d67f01fe1699e4a660559168192027cce64942fe0e18dbf5cfc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c
987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:07:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a2c7851134c0a4db02b6a95f4ea55accbf79ce1eac6312abc411e57e27f4ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T18:07:34Z\\\"}}}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a136ff588c627c875b5b46d7ce11382f4780317f87edfb532721bc3b36f71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9a136ff588c627c875b5b46d7ce11382f4780317f87edfb532721bc3b36f71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T18:07:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T18:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"
audit-dir\\\"}]}]}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Pod \"kube-apiserver-crc\" is invalid: metadata.uid: Invalid value: \"e76c31bc-28af-4476-8a03-e9250a873fa6\": field is immutable" Mar 18 18:07:40 crc kubenswrapper[5008]: I0318 18:07:40.874291 5008 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="574560a2-7706-4850-808c-fd6cff430089" Mar 18 18:07:41 crc kubenswrapper[5008]: I0318 18:07:41.814746 5008 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e76c31bc-28af-4476-8a03-e9250a873fa6" Mar 18 18:07:41 crc kubenswrapper[5008]: I0318 18:07:41.814781 5008 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e76c31bc-28af-4476-8a03-e9250a873fa6" Mar 18 18:07:41 crc kubenswrapper[5008]: I0318 18:07:41.818299 5008 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="574560a2-7706-4850-808c-fd6cff430089" Mar 18 18:07:42 crc kubenswrapper[5008]: I0318 18:07:42.632296 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:07:45 crc kubenswrapper[5008]: I0318 18:07:45.648239 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:07:45 crc kubenswrapper[5008]: I0318 18:07:45.656850 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.172889 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.172971 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.173047 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.173103 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.175713 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.176138 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 
18:07:46.176732 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.185322 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.185841 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.193878 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.202422 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.202541 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.214204 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.223710 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.237347 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:07:46 crc kubenswrapper[5008]: W0318 18:07:46.783262 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-a6fb8a0d5bc931545f043c6b629c01b78ebc5b9bef3d71a458b13d3653f44a1e WatchSource:0}: Error finding container a6fb8a0d5bc931545f043c6b629c01b78ebc5b9bef3d71a458b13d3653f44a1e: Status 404 returned error can't find the container with id a6fb8a0d5bc931545f043c6b629c01b78ebc5b9bef3d71a458b13d3653f44a1e Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.862996 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a6fb8a0d5bc931545f043c6b629c01b78ebc5b9bef3d71a458b13d3653f44a1e"} Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.866095 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b211c645352879f5dfa5defdbefd2ff9c5bbb03a70632bcadaaa6a0865f978ff"} Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.866143 5008 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7a3e70240658aab82251b2e74e8baf30c66c8d26783537785b281890e7fcaab8"} Mar 18 18:07:46 crc kubenswrapper[5008]: I0318 18:07:46.867967 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6e7c88d8d4d796959221e0818d3ea8021b36afdc2dc15e82167c2ec0a1124552"} Mar 18 18:07:47 crc kubenswrapper[5008]: I0318 18:07:47.879027 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"847e7c56b67b6438f9834dac19e021af731a6826888c3b48074a2584b838fec3"} Mar 18 18:07:47 crc kubenswrapper[5008]: I0318 18:07:47.881265 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4377e4fa2ff82ad9b0175893d4a9cac0c459fde76f14e0ab7a60fe2834017e00"} Mar 18 18:07:48 crc kubenswrapper[5008]: I0318 18:07:48.888232 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 18 18:07:48 crc kubenswrapper[5008]: I0318 18:07:48.888307 5008 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="b211c645352879f5dfa5defdbefd2ff9c5bbb03a70632bcadaaa6a0865f978ff" exitCode=255 Mar 18 18:07:48 crc kubenswrapper[5008]: I0318 18:07:48.888432 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"b211c645352879f5dfa5defdbefd2ff9c5bbb03a70632bcadaaa6a0865f978ff"} Mar 18 18:07:48 crc kubenswrapper[5008]: I0318 18:07:48.889944 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:07:48 crc kubenswrapper[5008]: I0318 18:07:48.890636 5008 scope.go:117] "RemoveContainer" containerID="b211c645352879f5dfa5defdbefd2ff9c5bbb03a70632bcadaaa6a0865f978ff" Mar 18 18:07:49 crc kubenswrapper[5008]: I0318 18:07:49.896239 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 18:07:49 crc kubenswrapper[5008]: I0318 18:07:49.900480 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 18 18:07:49 crc kubenswrapper[5008]: I0318 18:07:49.900527 5008 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="a101098227432d58a2721c979bff1e6ccbbd07c9c1f0be52b2db58641297fb02" exitCode=255 Mar 18 18:07:49 crc kubenswrapper[5008]: I0318 18:07:49.900605 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"a101098227432d58a2721c979bff1e6ccbbd07c9c1f0be52b2db58641297fb02"} Mar 18 18:07:49 crc kubenswrapper[5008]: I0318 18:07:49.900698 5008 scope.go:117] "RemoveContainer" containerID="b211c645352879f5dfa5defdbefd2ff9c5bbb03a70632bcadaaa6a0865f978ff" Mar 18 18:07:49 crc kubenswrapper[5008]: I0318 18:07:49.901659 5008 scope.go:117] "RemoveContainer" containerID="a101098227432d58a2721c979bff1e6ccbbd07c9c1f0be52b2db58641297fb02" Mar 18 
18:07:49 crc kubenswrapper[5008]: E0318 18:07:49.901956 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 18:07:50 crc kubenswrapper[5008]: I0318 18:07:50.075960 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 18:07:50 crc kubenswrapper[5008]: I0318 18:07:50.508283 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 18:07:50 crc kubenswrapper[5008]: I0318 18:07:50.824825 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 18:07:50 crc kubenswrapper[5008]: I0318 18:07:50.910893 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 18:07:51 crc kubenswrapper[5008]: I0318 18:07:51.090108 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 18:07:51 crc kubenswrapper[5008]: I0318 18:07:51.490803 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 18:07:51 crc kubenswrapper[5008]: I0318 18:07:51.607395 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 18:07:51 crc kubenswrapper[5008]: I0318 18:07:51.673916 5008 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 18:07:52 crc kubenswrapper[5008]: I0318 18:07:52.200763 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 18:07:52 crc kubenswrapper[5008]: I0318 18:07:52.219437 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 18:07:52 crc kubenswrapper[5008]: I0318 18:07:52.395144 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 18:07:52 crc kubenswrapper[5008]: I0318 18:07:52.521776 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 18:07:52 crc kubenswrapper[5008]: I0318 18:07:52.639885 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 18:07:53 crc kubenswrapper[5008]: I0318 18:07:53.108741 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 18:07:53 crc kubenswrapper[5008]: I0318 18:07:53.894607 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 18:07:53 crc kubenswrapper[5008]: I0318 18:07:53.992702 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 18:07:54 crc kubenswrapper[5008]: I0318 18:07:54.061974 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 18:07:54 crc kubenswrapper[5008]: I0318 18:07:54.095473 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 18:07:54 
crc kubenswrapper[5008]: I0318 18:07:54.190986 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 18:07:54 crc kubenswrapper[5008]: I0318 18:07:54.223680 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 18:07:54 crc kubenswrapper[5008]: I0318 18:07:54.278451 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 18:07:54 crc kubenswrapper[5008]: I0318 18:07:54.437515 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 18:07:54 crc kubenswrapper[5008]: I0318 18:07:54.455385 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 18:07:54 crc kubenswrapper[5008]: I0318 18:07:54.746360 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.037181 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.129352 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.196736 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.212891 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.226068 5008 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.278206 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.286762 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.342521 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.562500 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.697841 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.708901 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.756034 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.760321 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.827890 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 18:07:55 crc kubenswrapper[5008]: I0318 18:07:55.835201 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 
18:07:56 crc kubenswrapper[5008]: I0318 18:07:56.087069 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 18:07:56 crc kubenswrapper[5008]: I0318 18:07:56.253986 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 18:07:56 crc kubenswrapper[5008]: I0318 18:07:56.289724 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 18:07:56 crc kubenswrapper[5008]: I0318 18:07:56.324435 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 18:07:56 crc kubenswrapper[5008]: I0318 18:07:56.364203 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 18:07:56 crc kubenswrapper[5008]: I0318 18:07:56.466747 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 18:07:56 crc kubenswrapper[5008]: I0318 18:07:56.496657 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 18:07:56 crc kubenswrapper[5008]: I0318 18:07:56.508029 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 18:07:56 crc kubenswrapper[5008]: I0318 18:07:56.602824 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 18:07:56 crc kubenswrapper[5008]: I0318 18:07:56.898078 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.133390 5008 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.158780 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.190992 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.220472 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.251834 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.252061 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.254601 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.321342 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.394110 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.529829 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.623326 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 18:07:57 crc 
kubenswrapper[5008]: I0318 18:07:57.645964 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.652989 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.709254 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.726722 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.823835 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.963187 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.963505 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.964912 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 18:07:57 crc kubenswrapper[5008]: I0318 18:07:57.980641 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.008423 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.063121 5008 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.067405 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.311687 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.429275 5008 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.513453 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.571272 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.625609 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.647527 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.700960 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.724909 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.751159 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.854066 5008 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-image-registry"/"trusted-ca" Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.881703 5008 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 18:07:58 crc kubenswrapper[5008]: I0318 18:07:58.891822 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.016236 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.031395 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.035823 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.120274 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.127941 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.140203 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.255342 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.269967 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.276808 5008 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.331200 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.345539 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.400169 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.404255 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.435016 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.465286 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.489955 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.536331 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.536627 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.605132 5008 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.705265 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.720937 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.727002 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.727455 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.767324 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.794247 5008 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.797668 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=38.797639573 podStartE2EDuration="38.797639573s" podCreationTimestamp="2026-03-18 18:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:07:40.854419355 +0000 UTC m=+317.373892434" watchObservedRunningTime="2026-03-18 18:07:59.797639573 +0000 UTC m=+336.317112682" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.800765 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 
18:07:59.802158 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.802226 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.808622 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.828047 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.828021891 podStartE2EDuration="19.828021891s" podCreationTimestamp="2026-03-18 18:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:07:59.825044082 +0000 UTC m=+336.344517201" watchObservedRunningTime="2026-03-18 18:07:59.828021891 +0000 UTC m=+336.347495010" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.853502 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.856119 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 18:07:59 crc kubenswrapper[5008]: I0318 18:07:59.856476 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.102262 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.193614 5008 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.205364 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.209204 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564288-l8f4q"] Mar 18 18:08:00 crc kubenswrapper[5008]: E0318 18:08:00.209598 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" containerName="installer" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.209633 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" containerName="installer" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.209811 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c1c7b1-ecc0-4966-9be8-536fcd15335e" containerName="installer" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.210447 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564288-l8f4q" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.213211 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.213802 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.214062 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.269789 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7xvv\" (UniqueName: \"kubernetes.io/projected/17399326-c5b7-4432-a967-87849a37fc80-kube-api-access-t7xvv\") pod \"auto-csr-approver-29564288-l8f4q\" (UID: \"17399326-c5b7-4432-a967-87849a37fc80\") " pod="openshift-infra/auto-csr-approver-29564288-l8f4q" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.371807 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7xvv\" (UniqueName: \"kubernetes.io/projected/17399326-c5b7-4432-a967-87849a37fc80-kube-api-access-t7xvv\") pod \"auto-csr-approver-29564288-l8f4q\" (UID: \"17399326-c5b7-4432-a967-87849a37fc80\") " pod="openshift-infra/auto-csr-approver-29564288-l8f4q" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.388511 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.393915 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7xvv\" (UniqueName: \"kubernetes.io/projected/17399326-c5b7-4432-a967-87849a37fc80-kube-api-access-t7xvv\") pod \"auto-csr-approver-29564288-l8f4q\" (UID: \"17399326-c5b7-4432-a967-87849a37fc80\") " 
pod="openshift-infra/auto-csr-approver-29564288-l8f4q" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.460240 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.496144 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.505171 5008 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.509752 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.538083 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.543102 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564288-l8f4q" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.634266 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.691658 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.812854 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.816837 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.839340 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.958640 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 18:08:00 crc kubenswrapper[5008]: I0318 18:08:00.981363 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.116496 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.171124 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.212069 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 18:08:01 crc 
kubenswrapper[5008]: I0318 18:08:01.225380 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.258888 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.268282 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.292153 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.302820 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.330022 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.366707 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.388900 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.403634 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.428478 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.465155 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 18:08:01 crc 
kubenswrapper[5008]: I0318 18:08:01.476063 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.482186 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.515627 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.540106 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.555167 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.561789 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.570448 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.600845 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.654686 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.697321 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.719244 5008 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.793010 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.878965 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.983766 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 18:08:01 crc kubenswrapper[5008]: I0318 18:08:01.999540 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.048134 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.141494 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.358790 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.409910 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.425197 5008 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.473717 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.523374 5008 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.603549 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.618522 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.653504 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.659861 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.733360 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.865001 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.892167 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 18:08:02 crc kubenswrapper[5008]: I0318 18:08:02.910858 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.020156 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.071485 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 18:08:03 crc 
kubenswrapper[5008]: I0318 18:08:03.113110 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.148673 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.164533 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.177075 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.198385 5008 scope.go:117] "RemoveContainer" containerID="a101098227432d58a2721c979bff1e6ccbbd07c9c1f0be52b2db58641297fb02" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.258247 5008 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.259024 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://590b1953740af554f9dab8fa040bf5f70a5915f88385c0a79d447f29fe175abf" gracePeriod=5 Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.265813 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.349523 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.363466 5008 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.435110 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.473019 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.557688 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.620712 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564288-l8f4q"] Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.646020 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.706307 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.878277 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.995849 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 18:08:03 crc kubenswrapper[5008]: I0318 18:08:03.995894 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"be493a1f8638a34dc2f9398f9909fe6c7917e5f135cf35d18b327343b9db19c0"} Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 
18:08:04.060050 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.239707 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.241301 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.266684 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 18:08:04 crc kubenswrapper[5008]: E0318 18:08:04.323967 5008 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 18:08:04 crc kubenswrapper[5008]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29564288-l8f4q_openshift-infra_17399326-c5b7-4432-a967-87849a37fc80_0(599029fc25eaafaaa41bdd24917c8bc984688c1a0ddda545a34eeb27ad0f4a5c): error adding pod openshift-infra_auto-csr-approver-29564288-l8f4q to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"599029fc25eaafaaa41bdd24917c8bc984688c1a0ddda545a34eeb27ad0f4a5c" Netns:"/var/run/netns/eeb098f3-39a0-4b85-80ff-40fc5a45d8f1" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29564288-l8f4q;K8S_POD_INFRA_CONTAINER_ID=599029fc25eaafaaa41bdd24917c8bc984688c1a0ddda545a34eeb27ad0f4a5c;K8S_POD_UID=17399326-c5b7-4432-a967-87849a37fc80" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29564288-l8f4q] networking: Multus: [openshift-infra/auto-csr-approver-29564288-l8f4q/17399326-c5b7-4432-a967-87849a37fc80]: error setting the networks status, pod was 
already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29564288-l8f4q in out of cluster comm: pod "auto-csr-approver-29564288-l8f4q" not found Mar 18 18:08:04 crc kubenswrapper[5008]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 18:08:04 crc kubenswrapper[5008]: > Mar 18 18:08:04 crc kubenswrapper[5008]: E0318 18:08:04.324052 5008 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 18:08:04 crc kubenswrapper[5008]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29564288-l8f4q_openshift-infra_17399326-c5b7-4432-a967-87849a37fc80_0(599029fc25eaafaaa41bdd24917c8bc984688c1a0ddda545a34eeb27ad0f4a5c): error adding pod openshift-infra_auto-csr-approver-29564288-l8f4q to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"599029fc25eaafaaa41bdd24917c8bc984688c1a0ddda545a34eeb27ad0f4a5c" Netns:"/var/run/netns/eeb098f3-39a0-4b85-80ff-40fc5a45d8f1" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29564288-l8f4q;K8S_POD_INFRA_CONTAINER_ID=599029fc25eaafaaa41bdd24917c8bc984688c1a0ddda545a34eeb27ad0f4a5c;K8S_POD_UID=17399326-c5b7-4432-a967-87849a37fc80" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29564288-l8f4q] networking: Multus: [openshift-infra/auto-csr-approver-29564288-l8f4q/17399326-c5b7-4432-a967-87849a37fc80]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod 
auto-csr-approver-29564288-l8f4q in out of cluster comm: pod "auto-csr-approver-29564288-l8f4q" not found Mar 18 18:08:04 crc kubenswrapper[5008]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 18:08:04 crc kubenswrapper[5008]: > pod="openshift-infra/auto-csr-approver-29564288-l8f4q" Mar 18 18:08:04 crc kubenswrapper[5008]: E0318 18:08:04.324071 5008 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 18 18:08:04 crc kubenswrapper[5008]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29564288-l8f4q_openshift-infra_17399326-c5b7-4432-a967-87849a37fc80_0(599029fc25eaafaaa41bdd24917c8bc984688c1a0ddda545a34eeb27ad0f4a5c): error adding pod openshift-infra_auto-csr-approver-29564288-l8f4q to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"599029fc25eaafaaa41bdd24917c8bc984688c1a0ddda545a34eeb27ad0f4a5c" Netns:"/var/run/netns/eeb098f3-39a0-4b85-80ff-40fc5a45d8f1" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29564288-l8f4q;K8S_POD_INFRA_CONTAINER_ID=599029fc25eaafaaa41bdd24917c8bc984688c1a0ddda545a34eeb27ad0f4a5c;K8S_POD_UID=17399326-c5b7-4432-a967-87849a37fc80" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29564288-l8f4q] networking: Multus: [openshift-infra/auto-csr-approver-29564288-l8f4q/17399326-c5b7-4432-a967-87849a37fc80]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29564288-l8f4q 
in out of cluster comm: pod "auto-csr-approver-29564288-l8f4q" not found Mar 18 18:08:04 crc kubenswrapper[5008]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 18:08:04 crc kubenswrapper[5008]: > pod="openshift-infra/auto-csr-approver-29564288-l8f4q" Mar 18 18:08:04 crc kubenswrapper[5008]: E0318 18:08:04.324134 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29564288-l8f4q_openshift-infra(17399326-c5b7-4432-a967-87849a37fc80)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29564288-l8f4q_openshift-infra(17399326-c5b7-4432-a967-87849a37fc80)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29564288-l8f4q_openshift-infra_17399326-c5b7-4432-a967-87849a37fc80_0(599029fc25eaafaaa41bdd24917c8bc984688c1a0ddda545a34eeb27ad0f4a5c): error adding pod openshift-infra_auto-csr-approver-29564288-l8f4q to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"599029fc25eaafaaa41bdd24917c8bc984688c1a0ddda545a34eeb27ad0f4a5c\\\" Netns:\\\"/var/run/netns/eeb098f3-39a0-4b85-80ff-40fc5a45d8f1\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29564288-l8f4q;K8S_POD_INFRA_CONTAINER_ID=599029fc25eaafaaa41bdd24917c8bc984688c1a0ddda545a34eeb27ad0f4a5c;K8S_POD_UID=17399326-c5b7-4432-a967-87849a37fc80\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29564288-l8f4q] networking: 
Multus: [openshift-infra/auto-csr-approver-29564288-l8f4q/17399326-c5b7-4432-a967-87849a37fc80]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29564288-l8f4q in out of cluster comm: pod \\\"auto-csr-approver-29564288-l8f4q\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-infra/auto-csr-approver-29564288-l8f4q" podUID="17399326-c5b7-4432-a967-87849a37fc80" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.480742 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.512902 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.547063 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.550889 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.589920 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.621970 5008 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.679127 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.691515 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.697583 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.723859 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.750364 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.804986 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.882102 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.941633 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.983858 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.999414 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564288-l8f4q" Mar 18 18:08:04 crc kubenswrapper[5008]: I0318 18:08:04.999828 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564288-l8f4q" Mar 18 18:08:05 crc kubenswrapper[5008]: I0318 18:08:05.060192 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 18:08:05 crc kubenswrapper[5008]: I0318 18:08:05.074813 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 18:08:05 crc kubenswrapper[5008]: I0318 18:08:05.122868 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 18:08:05 crc kubenswrapper[5008]: I0318 18:08:05.194533 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 18:08:05 crc kubenswrapper[5008]: I0318 18:08:05.242059 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 18:08:05 crc kubenswrapper[5008]: I0318 18:08:05.483400 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 18:08:05 crc kubenswrapper[5008]: I0318 18:08:05.487093 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 18:08:05 crc kubenswrapper[5008]: I0318 18:08:05.572760 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 18:08:05 crc kubenswrapper[5008]: I0318 18:08:05.652089 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 18:08:05 crc 
kubenswrapper[5008]: I0318 18:08:05.857338 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 18:08:05 crc kubenswrapper[5008]: I0318 18:08:05.893820 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 18:08:05 crc kubenswrapper[5008]: I0318 18:08:05.950868 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 18:08:06 crc kubenswrapper[5008]: I0318 18:08:06.088484 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 18:08:06 crc kubenswrapper[5008]: I0318 18:08:06.096792 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 18:08:06 crc kubenswrapper[5008]: I0318 18:08:06.104801 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 18:08:06 crc kubenswrapper[5008]: I0318 18:08:06.137652 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 18:08:06 crc kubenswrapper[5008]: I0318 18:08:06.279982 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 18:08:06 crc kubenswrapper[5008]: I0318 18:08:06.280007 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 18:08:06 crc kubenswrapper[5008]: I0318 18:08:06.334595 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 18:08:06 crc kubenswrapper[5008]: I0318 18:08:06.362295 5008 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 18:08:06 crc kubenswrapper[5008]: I0318 18:08:06.394337 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 18:08:06 crc kubenswrapper[5008]: I0318 18:08:06.402535 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 18:08:06 crc kubenswrapper[5008]: I0318 18:08:06.601286 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564288-l8f4q"] Mar 18 18:08:06 crc kubenswrapper[5008]: I0318 18:08:06.795152 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 18:08:06 crc kubenswrapper[5008]: I0318 18:08:06.911875 5008 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 18:08:07 crc kubenswrapper[5008]: I0318 18:08:07.010690 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 18:08:07 crc kubenswrapper[5008]: I0318 18:08:07.013431 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564288-l8f4q" event={"ID":"17399326-c5b7-4432-a967-87849a37fc80","Type":"ContainerStarted","Data":"d392a35a10ee63804cab4986cabb952c871589f28abec532782c683d296f17af"} Mar 18 18:08:07 crc kubenswrapper[5008]: I0318 18:08:07.094379 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 18:08:07 crc kubenswrapper[5008]: I0318 18:08:07.215294 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 18:08:07 crc kubenswrapper[5008]: I0318 18:08:07.452849 5008 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 18:08:07 crc kubenswrapper[5008]: I0318 18:08:07.483523 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 18:08:07 crc kubenswrapper[5008]: I0318 18:08:07.621109 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 18:08:07 crc kubenswrapper[5008]: I0318 18:08:07.665843 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 18:08:07 crc kubenswrapper[5008]: I0318 18:08:07.678057 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 18:08:07 crc kubenswrapper[5008]: I0318 18:08:07.771672 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 18:08:07 crc kubenswrapper[5008]: I0318 18:08:07.837462 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 18:08:07 crc kubenswrapper[5008]: I0318 18:08:07.863599 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 18:08:07 crc kubenswrapper[5008]: I0318 18:08:07.864945 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.095449 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.228794 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 18:08:08 crc kubenswrapper[5008]: 
I0318 18:08:08.376980 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.604663 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.611606 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.758164 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.843617 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.843693 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.982299 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.982354 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.982422 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.982482 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.982504 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.982627 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: 
"resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.982668 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.982738 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.982743 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.982886 5008 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.982919 5008 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.982932 5008 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.982943 5008 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:08 crc kubenswrapper[5008]: I0318 18:08:08.991140 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:08:09 crc kubenswrapper[5008]: I0318 18:08:09.031992 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 18:08:09 crc kubenswrapper[5008]: I0318 18:08:09.032064 5008 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="590b1953740af554f9dab8fa040bf5f70a5915f88385c0a79d447f29fe175abf" exitCode=137 Mar 18 18:08:09 crc kubenswrapper[5008]: I0318 18:08:09.032170 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 18:08:09 crc kubenswrapper[5008]: I0318 18:08:09.032188 5008 scope.go:117] "RemoveContainer" containerID="590b1953740af554f9dab8fa040bf5f70a5915f88385c0a79d447f29fe175abf" Mar 18 18:08:09 crc kubenswrapper[5008]: I0318 18:08:09.034124 5008 generic.go:334] "Generic (PLEG): container finished" podID="17399326-c5b7-4432-a967-87849a37fc80" containerID="3a0973bb6d788209f3f6165f4dd85a1b5820d107f7a5172f083f4787d6d13a46" exitCode=0 Mar 18 18:08:09 crc kubenswrapper[5008]: I0318 18:08:09.034174 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564288-l8f4q" event={"ID":"17399326-c5b7-4432-a967-87849a37fc80","Type":"ContainerDied","Data":"3a0973bb6d788209f3f6165f4dd85a1b5820d107f7a5172f083f4787d6d13a46"} Mar 18 18:08:09 crc kubenswrapper[5008]: I0318 18:08:09.054845 5008 scope.go:117] "RemoveContainer" containerID="590b1953740af554f9dab8fa040bf5f70a5915f88385c0a79d447f29fe175abf" Mar 18 18:08:09 crc kubenswrapper[5008]: E0318 18:08:09.055354 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"590b1953740af554f9dab8fa040bf5f70a5915f88385c0a79d447f29fe175abf\": container with ID starting with 
590b1953740af554f9dab8fa040bf5f70a5915f88385c0a79d447f29fe175abf not found: ID does not exist" containerID="590b1953740af554f9dab8fa040bf5f70a5915f88385c0a79d447f29fe175abf" Mar 18 18:08:09 crc kubenswrapper[5008]: I0318 18:08:09.055394 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"590b1953740af554f9dab8fa040bf5f70a5915f88385c0a79d447f29fe175abf"} err="failed to get container status \"590b1953740af554f9dab8fa040bf5f70a5915f88385c0a79d447f29fe175abf\": rpc error: code = NotFound desc = could not find container \"590b1953740af554f9dab8fa040bf5f70a5915f88385c0a79d447f29fe175abf\": container with ID starting with 590b1953740af554f9dab8fa040bf5f70a5915f88385c0a79d447f29fe175abf not found: ID does not exist" Mar 18 18:08:09 crc kubenswrapper[5008]: I0318 18:08:09.083785 5008 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:09 crc kubenswrapper[5008]: I0318 18:08:09.316963 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 18:08:10 crc kubenswrapper[5008]: I0318 18:08:10.206372 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 18 18:08:10 crc kubenswrapper[5008]: I0318 18:08:10.207083 5008 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 18 18:08:10 crc kubenswrapper[5008]: I0318 18:08:10.223196 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 18:08:10 crc kubenswrapper[5008]: I0318 18:08:10.223235 5008 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" 
mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="d28c7b87-a976-479d-9284-65f70980f815" Mar 18 18:08:10 crc kubenswrapper[5008]: I0318 18:08:10.230224 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 18:08:10 crc kubenswrapper[5008]: I0318 18:08:10.230277 5008 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="d28c7b87-a976-479d-9284-65f70980f815" Mar 18 18:08:10 crc kubenswrapper[5008]: I0318 18:08:10.334040 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564288-l8f4q" Mar 18 18:08:10 crc kubenswrapper[5008]: I0318 18:08:10.402369 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7xvv\" (UniqueName: \"kubernetes.io/projected/17399326-c5b7-4432-a967-87849a37fc80-kube-api-access-t7xvv\") pod \"17399326-c5b7-4432-a967-87849a37fc80\" (UID: \"17399326-c5b7-4432-a967-87849a37fc80\") " Mar 18 18:08:10 crc kubenswrapper[5008]: I0318 18:08:10.409783 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17399326-c5b7-4432-a967-87849a37fc80-kube-api-access-t7xvv" (OuterVolumeSpecName: "kube-api-access-t7xvv") pod "17399326-c5b7-4432-a967-87849a37fc80" (UID: "17399326-c5b7-4432-a967-87849a37fc80"). InnerVolumeSpecName "kube-api-access-t7xvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:08:10 crc kubenswrapper[5008]: I0318 18:08:10.504443 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7xvv\" (UniqueName: \"kubernetes.io/projected/17399326-c5b7-4432-a967-87849a37fc80-kube-api-access-t7xvv\") on node \"crc\" DevicePath \"\"" Mar 18 18:08:11 crc kubenswrapper[5008]: I0318 18:08:11.053889 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564288-l8f4q" event={"ID":"17399326-c5b7-4432-a967-87849a37fc80","Type":"ContainerDied","Data":"d392a35a10ee63804cab4986cabb952c871589f28abec532782c683d296f17af"} Mar 18 18:08:11 crc kubenswrapper[5008]: I0318 18:08:11.053938 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d392a35a10ee63804cab4986cabb952c871589f28abec532782c683d296f17af" Mar 18 18:08:11 crc kubenswrapper[5008]: I0318 18:08:11.054005 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564288-l8f4q" Mar 18 18:08:16 crc kubenswrapper[5008]: I0318 18:08:16.264063 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 18:08:29 crc kubenswrapper[5008]: I0318 18:08:29.168463 5008 generic.go:334] "Generic (PLEG): container finished" podID="c7314b49-e434-4c49-babe-7ebc3925639a" containerID="7bf93c3c418e5c8e6845b55e4a2e6cf21dc2e2700955ddb8f903fb257bab3497" exitCode=0 Mar 18 18:08:29 crc kubenswrapper[5008]: I0318 18:08:29.168604 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" event={"ID":"c7314b49-e434-4c49-babe-7ebc3925639a","Type":"ContainerDied","Data":"7bf93c3c418e5c8e6845b55e4a2e6cf21dc2e2700955ddb8f903fb257bab3497"} Mar 18 18:08:29 crc kubenswrapper[5008]: I0318 18:08:29.169687 5008 scope.go:117] "RemoveContainer" 
containerID="7bf93c3c418e5c8e6845b55e4a2e6cf21dc2e2700955ddb8f903fb257bab3497" Mar 18 18:08:30 crc kubenswrapper[5008]: I0318 18:08:30.184641 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" event={"ID":"c7314b49-e434-4c49-babe-7ebc3925639a","Type":"ContainerStarted","Data":"04594d8db88de3721af003a18676c11a9781a11cfca8bc18ab32f21f57d8640c"} Mar 18 18:08:30 crc kubenswrapper[5008]: I0318 18:08:30.185063 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:08:30 crc kubenswrapper[5008]: I0318 18:08:30.188633 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:08:54 crc kubenswrapper[5008]: I0318 18:08:54.459742 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:08:54 crc kubenswrapper[5008]: I0318 18:08:54.460315 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.503225 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wb6qx"] Mar 18 18:09:23 crc kubenswrapper[5008]: E0318 18:09:23.503947 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 
18:09:23.503959 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 18:09:23 crc kubenswrapper[5008]: E0318 18:09:23.503970 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17399326-c5b7-4432-a967-87849a37fc80" containerName="oc" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.503976 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="17399326-c5b7-4432-a967-87849a37fc80" containerName="oc" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.504072 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.504084 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="17399326-c5b7-4432-a967-87849a37fc80" containerName="oc" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.504411 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.529134 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wb6qx"] Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.694050 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3387feb9-2e24-4865-bd9f-7caca5d45e6c-registry-tls\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.694141 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3387feb9-2e24-4865-bd9f-7caca5d45e6c-trusted-ca\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.694204 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3387feb9-2e24-4865-bd9f-7caca5d45e6c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.694242 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3387feb9-2e24-4865-bd9f-7caca5d45e6c-bound-sa-token\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 
18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.694313 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.694355 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlfnp\" (UniqueName: \"kubernetes.io/projected/3387feb9-2e24-4865-bd9f-7caca5d45e6c-kube-api-access-rlfnp\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.694399 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3387feb9-2e24-4865-bd9f-7caca5d45e6c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.694467 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3387feb9-2e24-4865-bd9f-7caca5d45e6c-registry-certificates\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.724216 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.795940 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlfnp\" (UniqueName: \"kubernetes.io/projected/3387feb9-2e24-4865-bd9f-7caca5d45e6c-kube-api-access-rlfnp\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.795992 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3387feb9-2e24-4865-bd9f-7caca5d45e6c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.796037 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3387feb9-2e24-4865-bd9f-7caca5d45e6c-registry-certificates\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.796074 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3387feb9-2e24-4865-bd9f-7caca5d45e6c-registry-tls\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.796102 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3387feb9-2e24-4865-bd9f-7caca5d45e6c-trusted-ca\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.796132 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3387feb9-2e24-4865-bd9f-7caca5d45e6c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.796157 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3387feb9-2e24-4865-bd9f-7caca5d45e6c-bound-sa-token\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.796607 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3387feb9-2e24-4865-bd9f-7caca5d45e6c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.797524 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3387feb9-2e24-4865-bd9f-7caca5d45e6c-trusted-ca\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc 
kubenswrapper[5008]: I0318 18:09:23.797966 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3387feb9-2e24-4865-bd9f-7caca5d45e6c-registry-certificates\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.803105 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3387feb9-2e24-4865-bd9f-7caca5d45e6c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.803194 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3387feb9-2e24-4865-bd9f-7caca5d45e6c-registry-tls\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.816994 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlfnp\" (UniqueName: \"kubernetes.io/projected/3387feb9-2e24-4865-bd9f-7caca5d45e6c-kube-api-access-rlfnp\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.817785 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3387feb9-2e24-4865-bd9f-7caca5d45e6c-bound-sa-token\") pod \"image-registry-66df7c8f76-wb6qx\" (UID: \"3387feb9-2e24-4865-bd9f-7caca5d45e6c\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:23 crc kubenswrapper[5008]: I0318 18:09:23.898547 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:24 crc kubenswrapper[5008]: I0318 18:09:24.354440 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wb6qx"] Mar 18 18:09:24 crc kubenswrapper[5008]: W0318 18:09:24.365082 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3387feb9_2e24_4865_bd9f_7caca5d45e6c.slice/crio-b33708e6c3ce496c0c08f8e06c1e9fd1bfe107afb597bd891142175abd53a0fb WatchSource:0}: Error finding container b33708e6c3ce496c0c08f8e06c1e9fd1bfe107afb597bd891142175abd53a0fb: Status 404 returned error can't find the container with id b33708e6c3ce496c0c08f8e06c1e9fd1bfe107afb597bd891142175abd53a0fb Mar 18 18:09:24 crc kubenswrapper[5008]: I0318 18:09:24.460803 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:09:24 crc kubenswrapper[5008]: I0318 18:09:24.461081 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:09:24 crc kubenswrapper[5008]: I0318 18:09:24.604249 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" 
event={"ID":"3387feb9-2e24-4865-bd9f-7caca5d45e6c","Type":"ContainerStarted","Data":"b33708e6c3ce496c0c08f8e06c1e9fd1bfe107afb597bd891142175abd53a0fb"} Mar 18 18:09:25 crc kubenswrapper[5008]: I0318 18:09:25.611061 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" event={"ID":"3387feb9-2e24-4865-bd9f-7caca5d45e6c","Type":"ContainerStarted","Data":"b796f41b2198221dd016bb5a8c99a93c8a0f934515d9a4c0cfe6eeb434cd6a02"} Mar 18 18:09:25 crc kubenswrapper[5008]: I0318 18:09:25.611398 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:25 crc kubenswrapper[5008]: I0318 18:09:25.631772 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" podStartSLOduration=2.631750944 podStartE2EDuration="2.631750944s" podCreationTimestamp="2026-03-18 18:09:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:09:25.629698791 +0000 UTC m=+422.149171880" watchObservedRunningTime="2026-03-18 18:09:25.631750944 +0000 UTC m=+422.151224043" Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.546570 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xth4j"] Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.547300 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xth4j" podUID="420d2432-4da0-4be3-8489-56c06a682e03" containerName="registry-server" containerID="cri-o://1cbac443129b9336bfcf92ffdb3a27b95779323162385c2b5960f203fd7ce80b" gracePeriod=30 Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.557915 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b59s5"] Mar 18 18:09:38 
crc kubenswrapper[5008]: I0318 18:09:38.558152 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b59s5" podUID="9209339f-be23-444c-a635-04920e6a0cf6" containerName="registry-server" containerID="cri-o://2ce79d08f9d51b90cdf84677ccf94509cc2f99e596171e28bb256d4623a0adc1" gracePeriod=30 Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.568356 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tfg8c"] Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.568644 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" podUID="c7314b49-e434-4c49-babe-7ebc3925639a" containerName="marketplace-operator" containerID="cri-o://04594d8db88de3721af003a18676c11a9781a11cfca8bc18ab32f21f57d8640c" gracePeriod=30 Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.582601 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrtz8"] Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.582935 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vrtz8" podUID="5d041acc-48d2-4f2f-896f-94893b9ff41f" containerName="registry-server" containerID="cri-o://5805c353901b2fa8e9bbcdf9c04063f82345b79b1305fbee5410a5bfa21d3062" gracePeriod=30 Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.590415 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mvtrx"] Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.591190 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.593926 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5nhgk"] Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.594136 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5nhgk" podUID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" containerName="registry-server" containerID="cri-o://4d5b51677a2fd68f58f8d49ac4481dd3d7a09866c4390d1f2419ba024e260f8a" gracePeriod=30 Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.608338 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mvtrx"] Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.723965 5008 generic.go:334] "Generic (PLEG): container finished" podID="420d2432-4da0-4be3-8489-56c06a682e03" containerID="1cbac443129b9336bfcf92ffdb3a27b95779323162385c2b5960f203fd7ce80b" exitCode=0 Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.724457 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xth4j" event={"ID":"420d2432-4da0-4be3-8489-56c06a682e03","Type":"ContainerDied","Data":"1cbac443129b9336bfcf92ffdb3a27b95779323162385c2b5960f203fd7ce80b"} Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.736219 5008 generic.go:334] "Generic (PLEG): container finished" podID="5d041acc-48d2-4f2f-896f-94893b9ff41f" containerID="5805c353901b2fa8e9bbcdf9c04063f82345b79b1305fbee5410a5bfa21d3062" exitCode=0 Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.736278 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrtz8" event={"ID":"5d041acc-48d2-4f2f-896f-94893b9ff41f","Type":"ContainerDied","Data":"5805c353901b2fa8e9bbcdf9c04063f82345b79b1305fbee5410a5bfa21d3062"} Mar 18 18:09:38 crc 
kubenswrapper[5008]: I0318 18:09:38.740053 5008 generic.go:334] "Generic (PLEG): container finished" podID="9209339f-be23-444c-a635-04920e6a0cf6" containerID="2ce79d08f9d51b90cdf84677ccf94509cc2f99e596171e28bb256d4623a0adc1" exitCode=0 Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.740096 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b59s5" event={"ID":"9209339f-be23-444c-a635-04920e6a0cf6","Type":"ContainerDied","Data":"2ce79d08f9d51b90cdf84677ccf94509cc2f99e596171e28bb256d4623a0adc1"} Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.744569 5008 generic.go:334] "Generic (PLEG): container finished" podID="c7314b49-e434-4c49-babe-7ebc3925639a" containerID="04594d8db88de3721af003a18676c11a9781a11cfca8bc18ab32f21f57d8640c" exitCode=0 Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.744593 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" event={"ID":"c7314b49-e434-4c49-babe-7ebc3925639a","Type":"ContainerDied","Data":"04594d8db88de3721af003a18676c11a9781a11cfca8bc18ab32f21f57d8640c"} Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.744618 5008 scope.go:117] "RemoveContainer" containerID="7bf93c3c418e5c8e6845b55e4a2e6cf21dc2e2700955ddb8f903fb257bab3497" Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.752517 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36bc5b4f-3a28-4f99-9c2d-e521639c546a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mvtrx\" (UID: \"36bc5b4f-3a28-4f99-9c2d-e521639c546a\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.752604 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n57k\" (UniqueName: 
\"kubernetes.io/projected/36bc5b4f-3a28-4f99-9c2d-e521639c546a-kube-api-access-4n57k\") pod \"marketplace-operator-79b997595-mvtrx\" (UID: \"36bc5b4f-3a28-4f99-9c2d-e521639c546a\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.752628 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36bc5b4f-3a28-4f99-9c2d-e521639c546a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mvtrx\" (UID: \"36bc5b4f-3a28-4f99-9c2d-e521639c546a\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.853866 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n57k\" (UniqueName: \"kubernetes.io/projected/36bc5b4f-3a28-4f99-9c2d-e521639c546a-kube-api-access-4n57k\") pod \"marketplace-operator-79b997595-mvtrx\" (UID: \"36bc5b4f-3a28-4f99-9c2d-e521639c546a\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.853955 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36bc5b4f-3a28-4f99-9c2d-e521639c546a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mvtrx\" (UID: \"36bc5b4f-3a28-4f99-9c2d-e521639c546a\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.854998 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36bc5b4f-3a28-4f99-9c2d-e521639c546a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mvtrx\" (UID: \"36bc5b4f-3a28-4f99-9c2d-e521639c546a\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.856319 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36bc5b4f-3a28-4f99-9c2d-e521639c546a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mvtrx\" (UID: \"36bc5b4f-3a28-4f99-9c2d-e521639c546a\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.858761 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36bc5b4f-3a28-4f99-9c2d-e521639c546a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mvtrx\" (UID: \"36bc5b4f-3a28-4f99-9c2d-e521639c546a\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" Mar 18 18:09:38 crc kubenswrapper[5008]: I0318 18:09:38.870115 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n57k\" (UniqueName: \"kubernetes.io/projected/36bc5b4f-3a28-4f99-9c2d-e521639c546a-kube-api-access-4n57k\") pod \"marketplace-operator-79b997595-mvtrx\" (UID: \"36bc5b4f-3a28-4f99-9c2d-e521639c546a\") " pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.009129 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.023028 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.061140 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.065322 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.085002 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b59s5" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.093754 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.161865 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d041acc-48d2-4f2f-896f-94893b9ff41f-utilities\") pod \"5d041acc-48d2-4f2f-896f-94893b9ff41f\" (UID: \"5d041acc-48d2-4f2f-896f-94893b9ff41f\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.162089 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw8rk\" (UniqueName: \"kubernetes.io/projected/5d041acc-48d2-4f2f-896f-94893b9ff41f-kube-api-access-tw8rk\") pod \"5d041acc-48d2-4f2f-896f-94893b9ff41f\" (UID: \"5d041acc-48d2-4f2f-896f-94893b9ff41f\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.162115 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/420d2432-4da0-4be3-8489-56c06a682e03-catalog-content\") pod \"420d2432-4da0-4be3-8489-56c06a682e03\" (UID: \"420d2432-4da0-4be3-8489-56c06a682e03\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.162140 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p7jc\" (UniqueName: 
\"kubernetes.io/projected/420d2432-4da0-4be3-8489-56c06a682e03-kube-api-access-5p7jc\") pod \"420d2432-4da0-4be3-8489-56c06a682e03\" (UID: \"420d2432-4da0-4be3-8489-56c06a682e03\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.162165 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/420d2432-4da0-4be3-8489-56c06a682e03-utilities\") pod \"420d2432-4da0-4be3-8489-56c06a682e03\" (UID: \"420d2432-4da0-4be3-8489-56c06a682e03\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.162183 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d041acc-48d2-4f2f-896f-94893b9ff41f-catalog-content\") pod \"5d041acc-48d2-4f2f-896f-94893b9ff41f\" (UID: \"5d041acc-48d2-4f2f-896f-94893b9ff41f\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.166035 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d041acc-48d2-4f2f-896f-94893b9ff41f-utilities" (OuterVolumeSpecName: "utilities") pod "5d041acc-48d2-4f2f-896f-94893b9ff41f" (UID: "5d041acc-48d2-4f2f-896f-94893b9ff41f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.167252 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/420d2432-4da0-4be3-8489-56c06a682e03-kube-api-access-5p7jc" (OuterVolumeSpecName: "kube-api-access-5p7jc") pod "420d2432-4da0-4be3-8489-56c06a682e03" (UID: "420d2432-4da0-4be3-8489-56c06a682e03"). InnerVolumeSpecName "kube-api-access-5p7jc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.168244 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d041acc-48d2-4f2f-896f-94893b9ff41f-kube-api-access-tw8rk" (OuterVolumeSpecName: "kube-api-access-tw8rk") pod "5d041acc-48d2-4f2f-896f-94893b9ff41f" (UID: "5d041acc-48d2-4f2f-896f-94893b9ff41f"). InnerVolumeSpecName "kube-api-access-tw8rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.169103 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/420d2432-4da0-4be3-8489-56c06a682e03-utilities" (OuterVolumeSpecName: "utilities") pod "420d2432-4da0-4be3-8489-56c06a682e03" (UID: "420d2432-4da0-4be3-8489-56c06a682e03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.191197 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d041acc-48d2-4f2f-896f-94893b9ff41f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d041acc-48d2-4f2f-896f-94893b9ff41f" (UID: "5d041acc-48d2-4f2f-896f-94893b9ff41f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.209307 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/420d2432-4da0-4be3-8489-56c06a682e03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "420d2432-4da0-4be3-8489-56c06a682e03" (UID: "420d2432-4da0-4be3-8489-56c06a682e03"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.246821 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mvtrx"] Mar 18 18:09:39 crc kubenswrapper[5008]: W0318 18:09:39.253133 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36bc5b4f_3a28_4f99_9c2d_e521639c546a.slice/crio-d3fb39aeb8ba4645a7f8accc7aaa523cb436de4f3c067b6e04204d43100497bd WatchSource:0}: Error finding container d3fb39aeb8ba4645a7f8accc7aaa523cb436de4f3c067b6e04204d43100497bd: Status 404 returned error can't find the container with id d3fb39aeb8ba4645a7f8accc7aaa523cb436de4f3c067b6e04204d43100497bd Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.263582 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7314b49-e434-4c49-babe-7ebc3925639a-marketplace-trusted-ca\") pod \"c7314b49-e434-4c49-babe-7ebc3925639a\" (UID: \"c7314b49-e434-4c49-babe-7ebc3925639a\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.263658 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9209339f-be23-444c-a635-04920e6a0cf6-catalog-content\") pod \"9209339f-be23-444c-a635-04920e6a0cf6\" (UID: \"9209339f-be23-444c-a635-04920e6a0cf6\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.263702 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9209339f-be23-444c-a635-04920e6a0cf6-utilities\") pod \"9209339f-be23-444c-a635-04920e6a0cf6\" (UID: \"9209339f-be23-444c-a635-04920e6a0cf6\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.263722 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-catalog-content\") pod \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\" (UID: \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.263756 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7314b49-e434-4c49-babe-7ebc3925639a-marketplace-operator-metrics\") pod \"c7314b49-e434-4c49-babe-7ebc3925639a\" (UID: \"c7314b49-e434-4c49-babe-7ebc3925639a\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.263800 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rdtl\" (UniqueName: \"kubernetes.io/projected/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-kube-api-access-7rdtl\") pod \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\" (UID: \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.263844 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-utilities\") pod \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\" (UID: \"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.263861 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-947tz\" (UniqueName: \"kubernetes.io/projected/9209339f-be23-444c-a635-04920e6a0cf6-kube-api-access-947tz\") pod \"9209339f-be23-444c-a635-04920e6a0cf6\" (UID: \"9209339f-be23-444c-a635-04920e6a0cf6\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.263881 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hklz\" (UniqueName: \"kubernetes.io/projected/c7314b49-e434-4c49-babe-7ebc3925639a-kube-api-access-2hklz\") pod \"c7314b49-e434-4c49-babe-7ebc3925639a\" (UID: 
\"c7314b49-e434-4c49-babe-7ebc3925639a\") " Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.264061 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d041acc-48d2-4f2f-896f-94893b9ff41f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.264081 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw8rk\" (UniqueName: \"kubernetes.io/projected/5d041acc-48d2-4f2f-896f-94893b9ff41f-kube-api-access-tw8rk\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.264090 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/420d2432-4da0-4be3-8489-56c06a682e03-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.264101 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p7jc\" (UniqueName: \"kubernetes.io/projected/420d2432-4da0-4be3-8489-56c06a682e03-kube-api-access-5p7jc\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.264109 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/420d2432-4da0-4be3-8489-56c06a682e03-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.264117 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d041acc-48d2-4f2f-896f-94893b9ff41f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.266477 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9209339f-be23-444c-a635-04920e6a0cf6-utilities" (OuterVolumeSpecName: "utilities") pod "9209339f-be23-444c-a635-04920e6a0cf6" (UID: 
"9209339f-be23-444c-a635-04920e6a0cf6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.266650 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7314b49-e434-4c49-babe-7ebc3925639a-kube-api-access-2hklz" (OuterVolumeSpecName: "kube-api-access-2hklz") pod "c7314b49-e434-4c49-babe-7ebc3925639a" (UID: "c7314b49-e434-4c49-babe-7ebc3925639a"). InnerVolumeSpecName "kube-api-access-2hklz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.267137 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7314b49-e434-4c49-babe-7ebc3925639a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c7314b49-e434-4c49-babe-7ebc3925639a" (UID: "c7314b49-e434-4c49-babe-7ebc3925639a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.267240 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-utilities" (OuterVolumeSpecName: "utilities") pod "5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" (UID: "5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.267702 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-kube-api-access-7rdtl" (OuterVolumeSpecName: "kube-api-access-7rdtl") pod "5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" (UID: "5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3"). InnerVolumeSpecName "kube-api-access-7rdtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.270232 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7314b49-e434-4c49-babe-7ebc3925639a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c7314b49-e434-4c49-babe-7ebc3925639a" (UID: "c7314b49-e434-4c49-babe-7ebc3925639a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.272703 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9209339f-be23-444c-a635-04920e6a0cf6-kube-api-access-947tz" (OuterVolumeSpecName: "kube-api-access-947tz") pod "9209339f-be23-444c-a635-04920e6a0cf6" (UID: "9209339f-be23-444c-a635-04920e6a0cf6"). InnerVolumeSpecName "kube-api-access-947tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.323606 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9209339f-be23-444c-a635-04920e6a0cf6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9209339f-be23-444c-a635-04920e6a0cf6" (UID: "9209339f-be23-444c-a635-04920e6a0cf6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.365671 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rdtl\" (UniqueName: \"kubernetes.io/projected/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-kube-api-access-7rdtl\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.365730 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.365748 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-947tz\" (UniqueName: \"kubernetes.io/projected/9209339f-be23-444c-a635-04920e6a0cf6-kube-api-access-947tz\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.365760 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hklz\" (UniqueName: \"kubernetes.io/projected/c7314b49-e434-4c49-babe-7ebc3925639a-kube-api-access-2hklz\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.365775 5008 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7314b49-e434-4c49-babe-7ebc3925639a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.365806 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9209339f-be23-444c-a635-04920e6a0cf6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.365817 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9209339f-be23-444c-a635-04920e6a0cf6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc 
kubenswrapper[5008]: I0318 18:09:39.365828 5008 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c7314b49-e434-4c49-babe-7ebc3925639a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.393477 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" (UID: "5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.467353 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.750625 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.750640 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tfg8c" event={"ID":"c7314b49-e434-4c49-babe-7ebc3925639a","Type":"ContainerDied","Data":"8c80ce51085fb7d45fefeea81af83e89a625c064305162972b17ab8dfc157624"} Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.751000 5008 scope.go:117] "RemoveContainer" containerID="04594d8db88de3721af003a18676c11a9781a11cfca8bc18ab32f21f57d8640c" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.752458 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" event={"ID":"36bc5b4f-3a28-4f99-9c2d-e521639c546a","Type":"ContainerStarted","Data":"c78c86c075774a3a19bcfa5315c8b6de98b6ed2f24cd9532fec5331e5e1903bd"} Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.752506 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" event={"ID":"36bc5b4f-3a28-4f99-9c2d-e521639c546a","Type":"ContainerStarted","Data":"d3fb39aeb8ba4645a7f8accc7aaa523cb436de4f3c067b6e04204d43100497bd"} Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.753069 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.754865 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xth4j" event={"ID":"420d2432-4da0-4be3-8489-56c06a682e03","Type":"ContainerDied","Data":"e2e936d049e1d78972d56c54c2a24e7ee96192ddf7a18ba76beb0e07359786b7"} Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.754929 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xth4j" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.761313 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrtz8" event={"ID":"5d041acc-48d2-4f2f-896f-94893b9ff41f","Type":"ContainerDied","Data":"331eb53ae51d183bc66d4665a6c1f8cf06d2165248a2e1d63cd24522cc621a29"} Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.761415 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrtz8" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.766348 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b59s5" event={"ID":"9209339f-be23-444c-a635-04920e6a0cf6","Type":"ContainerDied","Data":"ad829c9e61acd34611564eef21f78c28fdd5f6128296124ce6d56915ab89d1bb"} Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.766469 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b59s5" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.769945 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.771468 5008 generic.go:334] "Generic (PLEG): container finished" podID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" containerID="4d5b51677a2fd68f58f8d49ac4481dd3d7a09866c4390d1f2419ba024e260f8a" exitCode=0 Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.771622 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nhgk" event={"ID":"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3","Type":"ContainerDied","Data":"4d5b51677a2fd68f58f8d49ac4481dd3d7a09866c4390d1f2419ba024e260f8a"} Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.771692 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nhgk" event={"ID":"5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3","Type":"ContainerDied","Data":"ab3a12592bf5b5f4602c21176c74e03f52f1aea33b7702cb106ff77b80b513a6"} Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.771786 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5nhgk" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.778523 5008 scope.go:117] "RemoveContainer" containerID="1cbac443129b9336bfcf92ffdb3a27b95779323162385c2b5960f203fd7ce80b" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.782300 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-mvtrx" podStartSLOduration=1.782280745 podStartE2EDuration="1.782280745s" podCreationTimestamp="2026-03-18 18:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:09:39.777619946 +0000 UTC m=+436.297093035" watchObservedRunningTime="2026-03-18 18:09:39.782280745 +0000 UTC m=+436.301753834" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.807766 5008 scope.go:117] "RemoveContainer" containerID="2ae718b181bfb756e9349fe749313e583b1d1f9cbb4c77638751811927089977" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.837191 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrtz8"] Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.839156 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrtz8"] Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.850549 5008 scope.go:117] "RemoveContainer" containerID="f691da955768cca288b0fa8f50decdd516c251746e0870a9ba18c7ff53724add" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.855174 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5nhgk"] Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.866223 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5nhgk"] Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.869659 5008 scope.go:117] "RemoveContainer" 
containerID="5805c353901b2fa8e9bbcdf9c04063f82345b79b1305fbee5410a5bfa21d3062" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.870874 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tfg8c"] Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.878793 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tfg8c"] Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.883536 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xth4j"] Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.887740 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xth4j"] Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.889397 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b59s5"] Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.894317 5008 scope.go:117] "RemoveContainer" containerID="83d7470e7ce2ee42c888f13b5447487b65b48243a89b4b9f0607048c0a6ad1ac" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.895115 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b59s5"] Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.918287 5008 scope.go:117] "RemoveContainer" containerID="8faaf40a7bbc945d5b2e99f0545c58c4cea99f59920c163c286bfdd46270980d" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.934494 5008 scope.go:117] "RemoveContainer" containerID="2ce79d08f9d51b90cdf84677ccf94509cc2f99e596171e28bb256d4623a0adc1" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.946085 5008 scope.go:117] "RemoveContainer" containerID="6adb0576c5913e8c7712e8b51d776ae31be24f4483671b3746847b27180443e0" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.958474 5008 scope.go:117] "RemoveContainer" 
containerID="ce484199224a95f43dfbc70cf47c7daa3b81ccff4004f4b966d10d8f745b60c7" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.970775 5008 scope.go:117] "RemoveContainer" containerID="4d5b51677a2fd68f58f8d49ac4481dd3d7a09866c4390d1f2419ba024e260f8a" Mar 18 18:09:39 crc kubenswrapper[5008]: I0318 18:09:39.993510 5008 scope.go:117] "RemoveContainer" containerID="c72f2aa1b5d6f142979dd61a8c7dfa3b1138f23927b9bc8005cd93d927b45a4f" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.011437 5008 scope.go:117] "RemoveContainer" containerID="7be5c5242b365721ff4748c242c805063aa1ebb7d291a596d59446ec9a9f4372" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.034038 5008 scope.go:117] "RemoveContainer" containerID="4d5b51677a2fd68f58f8d49ac4481dd3d7a09866c4390d1f2419ba024e260f8a" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.034472 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d5b51677a2fd68f58f8d49ac4481dd3d7a09866c4390d1f2419ba024e260f8a\": container with ID starting with 4d5b51677a2fd68f58f8d49ac4481dd3d7a09866c4390d1f2419ba024e260f8a not found: ID does not exist" containerID="4d5b51677a2fd68f58f8d49ac4481dd3d7a09866c4390d1f2419ba024e260f8a" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.034545 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d5b51677a2fd68f58f8d49ac4481dd3d7a09866c4390d1f2419ba024e260f8a"} err="failed to get container status \"4d5b51677a2fd68f58f8d49ac4481dd3d7a09866c4390d1f2419ba024e260f8a\": rpc error: code = NotFound desc = could not find container \"4d5b51677a2fd68f58f8d49ac4481dd3d7a09866c4390d1f2419ba024e260f8a\": container with ID starting with 4d5b51677a2fd68f58f8d49ac4481dd3d7a09866c4390d1f2419ba024e260f8a not found: ID does not exist" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.034616 5008 scope.go:117] "RemoveContainer" 
containerID="c72f2aa1b5d6f142979dd61a8c7dfa3b1138f23927b9bc8005cd93d927b45a4f" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.035102 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c72f2aa1b5d6f142979dd61a8c7dfa3b1138f23927b9bc8005cd93d927b45a4f\": container with ID starting with c72f2aa1b5d6f142979dd61a8c7dfa3b1138f23927b9bc8005cd93d927b45a4f not found: ID does not exist" containerID="c72f2aa1b5d6f142979dd61a8c7dfa3b1138f23927b9bc8005cd93d927b45a4f" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.035139 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72f2aa1b5d6f142979dd61a8c7dfa3b1138f23927b9bc8005cd93d927b45a4f"} err="failed to get container status \"c72f2aa1b5d6f142979dd61a8c7dfa3b1138f23927b9bc8005cd93d927b45a4f\": rpc error: code = NotFound desc = could not find container \"c72f2aa1b5d6f142979dd61a8c7dfa3b1138f23927b9bc8005cd93d927b45a4f\": container with ID starting with c72f2aa1b5d6f142979dd61a8c7dfa3b1138f23927b9bc8005cd93d927b45a4f not found: ID does not exist" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.035169 5008 scope.go:117] "RemoveContainer" containerID="7be5c5242b365721ff4748c242c805063aa1ebb7d291a596d59446ec9a9f4372" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.035583 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7be5c5242b365721ff4748c242c805063aa1ebb7d291a596d59446ec9a9f4372\": container with ID starting with 7be5c5242b365721ff4748c242c805063aa1ebb7d291a596d59446ec9a9f4372 not found: ID does not exist" containerID="7be5c5242b365721ff4748c242c805063aa1ebb7d291a596d59446ec9a9f4372" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.035629 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7be5c5242b365721ff4748c242c805063aa1ebb7d291a596d59446ec9a9f4372"} err="failed to get container status \"7be5c5242b365721ff4748c242c805063aa1ebb7d291a596d59446ec9a9f4372\": rpc error: code = NotFound desc = could not find container \"7be5c5242b365721ff4748c242c805063aa1ebb7d291a596d59446ec9a9f4372\": container with ID starting with 7be5c5242b365721ff4748c242c805063aa1ebb7d291a596d59446ec9a9f4372 not found: ID does not exist" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.220794 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="420d2432-4da0-4be3-8489-56c06a682e03" path="/var/lib/kubelet/pods/420d2432-4da0-4be3-8489-56c06a682e03/volumes" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.222017 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" path="/var/lib/kubelet/pods/5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3/volumes" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.222732 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d041acc-48d2-4f2f-896f-94893b9ff41f" path="/var/lib/kubelet/pods/5d041acc-48d2-4f2f-896f-94893b9ff41f/volumes" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.223937 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9209339f-be23-444c-a635-04920e6a0cf6" path="/var/lib/kubelet/pods/9209339f-be23-444c-a635-04920e6a0cf6/volumes" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.224637 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7314b49-e434-4c49-babe-7ebc3925639a" path="/var/lib/kubelet/pods/c7314b49-e434-4c49-babe-7ebc3925639a/volumes" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.771698 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nfbpl"] Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.771931 5008 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9209339f-be23-444c-a635-04920e6a0cf6" containerName="extract-utilities" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.771947 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9209339f-be23-444c-a635-04920e6a0cf6" containerName="extract-utilities" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.771957 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="420d2432-4da0-4be3-8489-56c06a682e03" containerName="extract-content" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.771964 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="420d2432-4da0-4be3-8489-56c06a682e03" containerName="extract-content" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.771975 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d041acc-48d2-4f2f-896f-94893b9ff41f" containerName="registry-server" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.771983 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d041acc-48d2-4f2f-896f-94893b9ff41f" containerName="registry-server" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.771996 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d041acc-48d2-4f2f-896f-94893b9ff41f" containerName="extract-content" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772003 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d041acc-48d2-4f2f-896f-94893b9ff41f" containerName="extract-content" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.772014 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7314b49-e434-4c49-babe-7ebc3925639a" containerName="marketplace-operator" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772021 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7314b49-e434-4c49-babe-7ebc3925639a" containerName="marketplace-operator" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.772030 5008 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9209339f-be23-444c-a635-04920e6a0cf6" containerName="extract-content" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772039 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9209339f-be23-444c-a635-04920e6a0cf6" containerName="extract-content" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.772052 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9209339f-be23-444c-a635-04920e6a0cf6" containerName="registry-server" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772060 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9209339f-be23-444c-a635-04920e6a0cf6" containerName="registry-server" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.772071 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" containerName="extract-utilities" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772079 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" containerName="extract-utilities" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.772089 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="420d2432-4da0-4be3-8489-56c06a682e03" containerName="registry-server" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772096 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="420d2432-4da0-4be3-8489-56c06a682e03" containerName="registry-server" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.772106 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" containerName="extract-content" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772112 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" containerName="extract-content" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.772123 5008 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="420d2432-4da0-4be3-8489-56c06a682e03" containerName="extract-utilities" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772129 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="420d2432-4da0-4be3-8489-56c06a682e03" containerName="extract-utilities" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.772140 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" containerName="registry-server" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772147 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" containerName="registry-server" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.772157 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d041acc-48d2-4f2f-896f-94893b9ff41f" containerName="extract-utilities" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772163 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d041acc-48d2-4f2f-896f-94893b9ff41f" containerName="extract-utilities" Mar 18 18:09:40 crc kubenswrapper[5008]: E0318 18:09:40.772173 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7314b49-e434-4c49-babe-7ebc3925639a" containerName="marketplace-operator" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772179 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7314b49-e434-4c49-babe-7ebc3925639a" containerName="marketplace-operator" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772291 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7314b49-e434-4c49-babe-7ebc3925639a" containerName="marketplace-operator" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772303 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d041acc-48d2-4f2f-896f-94893b9ff41f" containerName="registry-server" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772316 5008 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="420d2432-4da0-4be3-8489-56c06a682e03" containerName="registry-server" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772326 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a5d0987-3c0d-4e5d-959c-65e7dbacb6e3" containerName="registry-server" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772337 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9209339f-be23-444c-a635-04920e6a0cf6" containerName="registry-server" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.772538 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7314b49-e434-4c49-babe-7ebc3925639a" containerName="marketplace-operator" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.773253 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.775127 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.780373 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfbpl"] Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.788278 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15786344-bb69-48e3-8bc9-e7fea7106bc8-utilities\") pod \"redhat-marketplace-nfbpl\" (UID: \"15786344-bb69-48e3-8bc9-e7fea7106bc8\") " pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.788351 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15786344-bb69-48e3-8bc9-e7fea7106bc8-catalog-content\") pod \"redhat-marketplace-nfbpl\" (UID: 
\"15786344-bb69-48e3-8bc9-e7fea7106bc8\") " pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.788397 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7jlp\" (UniqueName: \"kubernetes.io/projected/15786344-bb69-48e3-8bc9-e7fea7106bc8-kube-api-access-f7jlp\") pod \"redhat-marketplace-nfbpl\" (UID: \"15786344-bb69-48e3-8bc9-e7fea7106bc8\") " pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.888711 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15786344-bb69-48e3-8bc9-e7fea7106bc8-catalog-content\") pod \"redhat-marketplace-nfbpl\" (UID: \"15786344-bb69-48e3-8bc9-e7fea7106bc8\") " pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.888771 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7jlp\" (UniqueName: \"kubernetes.io/projected/15786344-bb69-48e3-8bc9-e7fea7106bc8-kube-api-access-f7jlp\") pod \"redhat-marketplace-nfbpl\" (UID: \"15786344-bb69-48e3-8bc9-e7fea7106bc8\") " pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.888828 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15786344-bb69-48e3-8bc9-e7fea7106bc8-utilities\") pod \"redhat-marketplace-nfbpl\" (UID: \"15786344-bb69-48e3-8bc9-e7fea7106bc8\") " pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.889283 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15786344-bb69-48e3-8bc9-e7fea7106bc8-catalog-content\") pod \"redhat-marketplace-nfbpl\" (UID: 
\"15786344-bb69-48e3-8bc9-e7fea7106bc8\") " pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.889321 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15786344-bb69-48e3-8bc9-e7fea7106bc8-utilities\") pod \"redhat-marketplace-nfbpl\" (UID: \"15786344-bb69-48e3-8bc9-e7fea7106bc8\") " pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.904968 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7jlp\" (UniqueName: \"kubernetes.io/projected/15786344-bb69-48e3-8bc9-e7fea7106bc8-kube-api-access-f7jlp\") pod \"redhat-marketplace-nfbpl\" (UID: \"15786344-bb69-48e3-8bc9-e7fea7106bc8\") " pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.967204 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dhkrf"] Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.970837 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.972994 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 18:09:40 crc kubenswrapper[5008]: I0318 18:09:40.973055 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dhkrf"] Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.091060 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069bee86-2582-433b-a7e9-59fa22bb650d-catalog-content\") pod \"certified-operators-dhkrf\" (UID: \"069bee86-2582-433b-a7e9-59fa22bb650d\") " pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.091240 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b76d\" (UniqueName: \"kubernetes.io/projected/069bee86-2582-433b-a7e9-59fa22bb650d-kube-api-access-9b76d\") pod \"certified-operators-dhkrf\" (UID: \"069bee86-2582-433b-a7e9-59fa22bb650d\") " pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.091307 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069bee86-2582-433b-a7e9-59fa22bb650d-utilities\") pod \"certified-operators-dhkrf\" (UID: \"069bee86-2582-433b-a7e9-59fa22bb650d\") " pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.094712 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.193096 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069bee86-2582-433b-a7e9-59fa22bb650d-catalog-content\") pod \"certified-operators-dhkrf\" (UID: \"069bee86-2582-433b-a7e9-59fa22bb650d\") " pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.193473 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b76d\" (UniqueName: \"kubernetes.io/projected/069bee86-2582-433b-a7e9-59fa22bb650d-kube-api-access-9b76d\") pod \"certified-operators-dhkrf\" (UID: \"069bee86-2582-433b-a7e9-59fa22bb650d\") " pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.193505 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069bee86-2582-433b-a7e9-59fa22bb650d-utilities\") pod \"certified-operators-dhkrf\" (UID: \"069bee86-2582-433b-a7e9-59fa22bb650d\") " pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.193899 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069bee86-2582-433b-a7e9-59fa22bb650d-utilities\") pod \"certified-operators-dhkrf\" (UID: \"069bee86-2582-433b-a7e9-59fa22bb650d\") " pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.193928 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069bee86-2582-433b-a7e9-59fa22bb650d-catalog-content\") pod \"certified-operators-dhkrf\" (UID: \"069bee86-2582-433b-a7e9-59fa22bb650d\") " 
pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.213282 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b76d\" (UniqueName: \"kubernetes.io/projected/069bee86-2582-433b-a7e9-59fa22bb650d-kube-api-access-9b76d\") pod \"certified-operators-dhkrf\" (UID: \"069bee86-2582-433b-a7e9-59fa22bb650d\") " pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.269222 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfbpl"] Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.285862 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.471306 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dhkrf"] Mar 18 18:09:41 crc kubenswrapper[5008]: W0318 18:09:41.501877 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod069bee86_2582_433b_a7e9_59fa22bb650d.slice/crio-7626cb9a516a8f07e038bc27eca2ffe47dcea4eb8a833d0439475a107cbb0b67 WatchSource:0}: Error finding container 7626cb9a516a8f07e038bc27eca2ffe47dcea4eb8a833d0439475a107cbb0b67: Status 404 returned error can't find the container with id 7626cb9a516a8f07e038bc27eca2ffe47dcea4eb8a833d0439475a107cbb0b67 Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.787388 5008 generic.go:334] "Generic (PLEG): container finished" podID="069bee86-2582-433b-a7e9-59fa22bb650d" containerID="e1fce6d190ccc6eead809fd939bbb9de3d3ee62464838b214eb16499fa1af0a5" exitCode=0 Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.787456 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhkrf" 
event={"ID":"069bee86-2582-433b-a7e9-59fa22bb650d","Type":"ContainerDied","Data":"e1fce6d190ccc6eead809fd939bbb9de3d3ee62464838b214eb16499fa1af0a5"} Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.787481 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhkrf" event={"ID":"069bee86-2582-433b-a7e9-59fa22bb650d","Type":"ContainerStarted","Data":"7626cb9a516a8f07e038bc27eca2ffe47dcea4eb8a833d0439475a107cbb0b67"} Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.788260 5008 generic.go:334] "Generic (PLEG): container finished" podID="15786344-bb69-48e3-8bc9-e7fea7106bc8" containerID="df4d7e5bf53222d7f9f44e8647216ab83368abcd4fc88d11e35495cc33b07fdb" exitCode=0 Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.788432 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfbpl" event={"ID":"15786344-bb69-48e3-8bc9-e7fea7106bc8","Type":"ContainerDied","Data":"df4d7e5bf53222d7f9f44e8647216ab83368abcd4fc88d11e35495cc33b07fdb"} Mar 18 18:09:41 crc kubenswrapper[5008]: I0318 18:09:41.788476 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfbpl" event={"ID":"15786344-bb69-48e3-8bc9-e7fea7106bc8","Type":"ContainerStarted","Data":"7ab03d9820604fc7451c72b25ff9f9999d67ef901d4401d2eb97de364d03830d"} Mar 18 18:09:42 crc kubenswrapper[5008]: I0318 18:09:42.796047 5008 generic.go:334] "Generic (PLEG): container finished" podID="069bee86-2582-433b-a7e9-59fa22bb650d" containerID="7c96832ee619e25675780ff6d612d1c040ec63e3879561e39e3da0567846ab94" exitCode=0 Mar 18 18:09:42 crc kubenswrapper[5008]: I0318 18:09:42.796119 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhkrf" event={"ID":"069bee86-2582-433b-a7e9-59fa22bb650d","Type":"ContainerDied","Data":"7c96832ee619e25675780ff6d612d1c040ec63e3879561e39e3da0567846ab94"} Mar 18 18:09:42 crc kubenswrapper[5008]: I0318 
18:09:42.799493 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfbpl" event={"ID":"15786344-bb69-48e3-8bc9-e7fea7106bc8","Type":"ContainerStarted","Data":"2a7aec4ed79859328a9905fa0fbaa776b0bea4d169241f7139cae6a4be7a060e"} Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.172175 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jddwj"] Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.173649 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.182174 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.189706 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jddwj"] Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.221068 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rnc9\" (UniqueName: \"kubernetes.io/projected/b857b251-c73d-4153-8155-4ddd0703759b-kube-api-access-8rnc9\") pod \"redhat-operators-jddwj\" (UID: \"b857b251-c73d-4153-8155-4ddd0703759b\") " pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.221129 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b857b251-c73d-4153-8155-4ddd0703759b-catalog-content\") pod \"redhat-operators-jddwj\" (UID: \"b857b251-c73d-4153-8155-4ddd0703759b\") " pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.221178 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b857b251-c73d-4153-8155-4ddd0703759b-utilities\") pod \"redhat-operators-jddwj\" (UID: \"b857b251-c73d-4153-8155-4ddd0703759b\") " pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.322383 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rnc9\" (UniqueName: \"kubernetes.io/projected/b857b251-c73d-4153-8155-4ddd0703759b-kube-api-access-8rnc9\") pod \"redhat-operators-jddwj\" (UID: \"b857b251-c73d-4153-8155-4ddd0703759b\") " pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.322443 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b857b251-c73d-4153-8155-4ddd0703759b-catalog-content\") pod \"redhat-operators-jddwj\" (UID: \"b857b251-c73d-4153-8155-4ddd0703759b\") " pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.322484 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b857b251-c73d-4153-8155-4ddd0703759b-utilities\") pod \"redhat-operators-jddwj\" (UID: \"b857b251-c73d-4153-8155-4ddd0703759b\") " pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.322930 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b857b251-c73d-4153-8155-4ddd0703759b-utilities\") pod \"redhat-operators-jddwj\" (UID: \"b857b251-c73d-4153-8155-4ddd0703759b\") " pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.323087 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b857b251-c73d-4153-8155-4ddd0703759b-catalog-content\") pod \"redhat-operators-jddwj\" (UID: \"b857b251-c73d-4153-8155-4ddd0703759b\") " pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.339534 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rnc9\" (UniqueName: \"kubernetes.io/projected/b857b251-c73d-4153-8155-4ddd0703759b-kube-api-access-8rnc9\") pod \"redhat-operators-jddwj\" (UID: \"b857b251-c73d-4153-8155-4ddd0703759b\") " pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.362057 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qz9gd"] Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.363120 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.364951 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.379224 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qz9gd"] Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.423364 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmjtv\" (UniqueName: \"kubernetes.io/projected/be98c2c7-e6a9-4459-9693-6c05374daff8-kube-api-access-xmjtv\") pod \"community-operators-qz9gd\" (UID: \"be98c2c7-e6a9-4459-9693-6c05374daff8\") " pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.423459 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/be98c2c7-e6a9-4459-9693-6c05374daff8-catalog-content\") pod \"community-operators-qz9gd\" (UID: \"be98c2c7-e6a9-4459-9693-6c05374daff8\") " pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.423541 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be98c2c7-e6a9-4459-9693-6c05374daff8-utilities\") pod \"community-operators-qz9gd\" (UID: \"be98c2c7-e6a9-4459-9693-6c05374daff8\") " pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.511131 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.524184 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmjtv\" (UniqueName: \"kubernetes.io/projected/be98c2c7-e6a9-4459-9693-6c05374daff8-kube-api-access-xmjtv\") pod \"community-operators-qz9gd\" (UID: \"be98c2c7-e6a9-4459-9693-6c05374daff8\") " pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.524235 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be98c2c7-e6a9-4459-9693-6c05374daff8-catalog-content\") pod \"community-operators-qz9gd\" (UID: \"be98c2c7-e6a9-4459-9693-6c05374daff8\") " pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.524260 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be98c2c7-e6a9-4459-9693-6c05374daff8-utilities\") pod \"community-operators-qz9gd\" (UID: \"be98c2c7-e6a9-4459-9693-6c05374daff8\") " 
pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.524766 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be98c2c7-e6a9-4459-9693-6c05374daff8-utilities\") pod \"community-operators-qz9gd\" (UID: \"be98c2c7-e6a9-4459-9693-6c05374daff8\") " pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.525087 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be98c2c7-e6a9-4459-9693-6c05374daff8-catalog-content\") pod \"community-operators-qz9gd\" (UID: \"be98c2c7-e6a9-4459-9693-6c05374daff8\") " pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.553059 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmjtv\" (UniqueName: \"kubernetes.io/projected/be98c2c7-e6a9-4459-9693-6c05374daff8-kube-api-access-xmjtv\") pod \"community-operators-qz9gd\" (UID: \"be98c2c7-e6a9-4459-9693-6c05374daff8\") " pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.731698 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.807294 5008 generic.go:334] "Generic (PLEG): container finished" podID="15786344-bb69-48e3-8bc9-e7fea7106bc8" containerID="2a7aec4ed79859328a9905fa0fbaa776b0bea4d169241f7139cae6a4be7a060e" exitCode=0 Mar 18 18:09:43 crc kubenswrapper[5008]: I0318 18:09:43.807339 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfbpl" event={"ID":"15786344-bb69-48e3-8bc9-e7fea7106bc8","Type":"ContainerDied","Data":"2a7aec4ed79859328a9905fa0fbaa776b0bea4d169241f7139cae6a4be7a060e"} Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:43.903022 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wb6qx" Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:43.930390 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qz9gd"] Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:43.953864 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5gw26"] Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:43.968161 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jddwj"] Mar 18 18:09:44 crc kubenswrapper[5008]: W0318 18:09:43.988011 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb857b251_c73d_4153_8155_4ddd0703759b.slice/crio-94fb3cef0a11b0a9733427257ffce1ee8ac8d80eaf7eb930c74a077a9f5931d2 WatchSource:0}: Error finding container 94fb3cef0a11b0a9733427257ffce1ee8ac8d80eaf7eb930c74a077a9f5931d2: Status 404 returned error can't find the container with id 94fb3cef0a11b0a9733427257ffce1ee8ac8d80eaf7eb930c74a077a9f5931d2 Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:44.815426 5008 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfbpl" event={"ID":"15786344-bb69-48e3-8bc9-e7fea7106bc8","Type":"ContainerStarted","Data":"2ae32acf808fd3682a2e6581defd37f5c311447c76076ae99b7eef8f8c47e9eb"} Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:44.817021 5008 generic.go:334] "Generic (PLEG): container finished" podID="be98c2c7-e6a9-4459-9693-6c05374daff8" containerID="368ec15b643f6c13f97f76934f018970cf68cfd6a1c00a40e55fc128a4ec413c" exitCode=0 Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:44.817129 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qz9gd" event={"ID":"be98c2c7-e6a9-4459-9693-6c05374daff8","Type":"ContainerDied","Data":"368ec15b643f6c13f97f76934f018970cf68cfd6a1c00a40e55fc128a4ec413c"} Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:44.817185 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qz9gd" event={"ID":"be98c2c7-e6a9-4459-9693-6c05374daff8","Type":"ContainerStarted","Data":"509e9ec3dbffbe6fe9a0a4bd7af094f809475218b665fe447d7c0bd4a61494d6"} Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:44.819924 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dhkrf" event={"ID":"069bee86-2582-433b-a7e9-59fa22bb650d","Type":"ContainerStarted","Data":"6512a9f359360872ca4a2677d09684951a4ee1f2d8c59ce7d3348859db0ab02e"} Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:44.822365 5008 generic.go:334] "Generic (PLEG): container finished" podID="b857b251-c73d-4153-8155-4ddd0703759b" containerID="a86e25f10c17d71da1b5701a7ab40767186f9e70896be1b5aa40a48635923b1e" exitCode=0 Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:44.822399 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jddwj" 
event={"ID":"b857b251-c73d-4153-8155-4ddd0703759b","Type":"ContainerDied","Data":"a86e25f10c17d71da1b5701a7ab40767186f9e70896be1b5aa40a48635923b1e"} Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:44.822415 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jddwj" event={"ID":"b857b251-c73d-4153-8155-4ddd0703759b","Type":"ContainerStarted","Data":"94fb3cef0a11b0a9733427257ffce1ee8ac8d80eaf7eb930c74a077a9f5931d2"} Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:44.839962 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nfbpl" podStartSLOduration=2.206211885 podStartE2EDuration="4.839938857s" podCreationTimestamp="2026-03-18 18:09:40 +0000 UTC" firstStartedPulling="2026-03-18 18:09:41.793549695 +0000 UTC m=+438.313022784" lastFinishedPulling="2026-03-18 18:09:44.427276677 +0000 UTC m=+440.946749756" observedRunningTime="2026-03-18 18:09:44.83850011 +0000 UTC m=+441.357973189" watchObservedRunningTime="2026-03-18 18:09:44.839938857 +0000 UTC m=+441.359411946" Mar 18 18:09:44 crc kubenswrapper[5008]: I0318 18:09:44.859961 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dhkrf" podStartSLOduration=2.950768163 podStartE2EDuration="4.859946261s" podCreationTimestamp="2026-03-18 18:09:40 +0000 UTC" firstStartedPulling="2026-03-18 18:09:41.788940127 +0000 UTC m=+438.308413206" lastFinishedPulling="2026-03-18 18:09:43.698118185 +0000 UTC m=+440.217591304" observedRunningTime="2026-03-18 18:09:44.857220811 +0000 UTC m=+441.376693900" watchObservedRunningTime="2026-03-18 18:09:44.859946261 +0000 UTC m=+441.379419340" Mar 18 18:09:46 crc kubenswrapper[5008]: I0318 18:09:46.843764 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qz9gd" 
event={"ID":"be98c2c7-e6a9-4459-9693-6c05374daff8","Type":"ContainerStarted","Data":"b9baadbe7a9ddbe57e598b47a8b7c5ac0b7799878e445c098c408a3b6e49bd4f"} Mar 18 18:09:47 crc kubenswrapper[5008]: I0318 18:09:47.853655 5008 generic.go:334] "Generic (PLEG): container finished" podID="b857b251-c73d-4153-8155-4ddd0703759b" containerID="f0392eba9023a9c753e9873dce990f3b3b45ed9ed6b1eb48d2d94a6232c5d600" exitCode=0 Mar 18 18:09:47 crc kubenswrapper[5008]: I0318 18:09:47.853760 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jddwj" event={"ID":"b857b251-c73d-4153-8155-4ddd0703759b","Type":"ContainerDied","Data":"f0392eba9023a9c753e9873dce990f3b3b45ed9ed6b1eb48d2d94a6232c5d600"} Mar 18 18:09:47 crc kubenswrapper[5008]: I0318 18:09:47.857514 5008 generic.go:334] "Generic (PLEG): container finished" podID="be98c2c7-e6a9-4459-9693-6c05374daff8" containerID="b9baadbe7a9ddbe57e598b47a8b7c5ac0b7799878e445c098c408a3b6e49bd4f" exitCode=0 Mar 18 18:09:47 crc kubenswrapper[5008]: I0318 18:09:47.857580 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qz9gd" event={"ID":"be98c2c7-e6a9-4459-9693-6c05374daff8","Type":"ContainerDied","Data":"b9baadbe7a9ddbe57e598b47a8b7c5ac0b7799878e445c098c408a3b6e49bd4f"} Mar 18 18:09:48 crc kubenswrapper[5008]: I0318 18:09:48.870011 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jddwj" event={"ID":"b857b251-c73d-4153-8155-4ddd0703759b","Type":"ContainerStarted","Data":"b6ead761569753e94b45c5e7f366c974e21d16f13eda267ab2b09ef25b79a17a"} Mar 18 18:09:48 crc kubenswrapper[5008]: I0318 18:09:48.877312 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qz9gd" event={"ID":"be98c2c7-e6a9-4459-9693-6c05374daff8","Type":"ContainerStarted","Data":"8b3460f1a6580208b2856913d35c5abff8d5990a2270737f557d906629c3296f"} Mar 18 18:09:48 crc kubenswrapper[5008]: I0318 
18:09:48.910684 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jddwj" podStartSLOduration=2.44533243 podStartE2EDuration="5.910642005s" podCreationTimestamp="2026-03-18 18:09:43 +0000 UTC" firstStartedPulling="2026-03-18 18:09:44.825590559 +0000 UTC m=+441.345063638" lastFinishedPulling="2026-03-18 18:09:48.290900134 +0000 UTC m=+444.810373213" observedRunningTime="2026-03-18 18:09:48.902038254 +0000 UTC m=+445.421511373" watchObservedRunningTime="2026-03-18 18:09:48.910642005 +0000 UTC m=+445.430115154" Mar 18 18:09:48 crc kubenswrapper[5008]: I0318 18:09:48.943438 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qz9gd" podStartSLOduration=2.497276164 podStartE2EDuration="5.943413107s" podCreationTimestamp="2026-03-18 18:09:43 +0000 UTC" firstStartedPulling="2026-03-18 18:09:44.819348108 +0000 UTC m=+441.338821187" lastFinishedPulling="2026-03-18 18:09:48.265485051 +0000 UTC m=+444.784958130" observedRunningTime="2026-03-18 18:09:48.938951052 +0000 UTC m=+445.458424171" watchObservedRunningTime="2026-03-18 18:09:48.943413107 +0000 UTC m=+445.462886206" Mar 18 18:09:51 crc kubenswrapper[5008]: I0318 18:09:51.095882 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:51 crc kubenswrapper[5008]: I0318 18:09:51.096477 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:51 crc kubenswrapper[5008]: I0318 18:09:51.175610 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:51 crc kubenswrapper[5008]: I0318 18:09:51.286210 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:51 crc 
kubenswrapper[5008]: I0318 18:09:51.286301 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:51 crc kubenswrapper[5008]: I0318 18:09:51.341164 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:51 crc kubenswrapper[5008]: I0318 18:09:51.974888 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nfbpl" Mar 18 18:09:51 crc kubenswrapper[5008]: I0318 18:09:51.974966 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dhkrf" Mar 18 18:09:53 crc kubenswrapper[5008]: I0318 18:09:53.511668 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:09:53 crc kubenswrapper[5008]: I0318 18:09:53.512248 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:09:53 crc kubenswrapper[5008]: I0318 18:09:53.733343 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:53 crc kubenswrapper[5008]: I0318 18:09:53.733459 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:53 crc kubenswrapper[5008]: I0318 18:09:53.802832 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:53 crc kubenswrapper[5008]: I0318 18:09:53.979911 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qz9gd" Mar 18 18:09:54 crc kubenswrapper[5008]: I0318 18:09:54.460672 5008 patch_prober.go:28] interesting 
pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:09:54 crc kubenswrapper[5008]: I0318 18:09:54.460749 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:09:54 crc kubenswrapper[5008]: I0318 18:09:54.460798 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:09:54 crc kubenswrapper[5008]: I0318 18:09:54.461453 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"265c93bd38176e028a3b20d735aef8eb6b45124abbc855b3703820d202fa1f53"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:09:54 crc kubenswrapper[5008]: I0318 18:09:54.461516 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://265c93bd38176e028a3b20d735aef8eb6b45124abbc855b3703820d202fa1f53" gracePeriod=600 Mar 18 18:09:54 crc kubenswrapper[5008]: I0318 18:09:54.587298 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jddwj" podUID="b857b251-c73d-4153-8155-4ddd0703759b" containerName="registry-server" probeResult="failure" output=< Mar 18 18:09:54 crc kubenswrapper[5008]: 
timeout: failed to connect service ":50051" within 1s Mar 18 18:09:54 crc kubenswrapper[5008]: > Mar 18 18:09:54 crc kubenswrapper[5008]: I0318 18:09:54.920130 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="265c93bd38176e028a3b20d735aef8eb6b45124abbc855b3703820d202fa1f53" exitCode=0 Mar 18 18:09:54 crc kubenswrapper[5008]: I0318 18:09:54.920207 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"265c93bd38176e028a3b20d735aef8eb6b45124abbc855b3703820d202fa1f53"} Mar 18 18:09:54 crc kubenswrapper[5008]: I0318 18:09:54.920443 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"4ccc64188129ed87aa4fe6905ebfe183cf7aa5b27085343f7bce874d89c57b22"} Mar 18 18:09:54 crc kubenswrapper[5008]: I0318 18:09:54.920481 5008 scope.go:117] "RemoveContainer" containerID="39082231274a47ab82bcfd1a9e57bf1aad4115d3baa10c788cd47e4b7d9b02f7" Mar 18 18:10:00 crc kubenswrapper[5008]: I0318 18:10:00.151745 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564290-8znfx"] Mar 18 18:10:00 crc kubenswrapper[5008]: I0318 18:10:00.153635 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564290-8znfx" Mar 18 18:10:00 crc kubenswrapper[5008]: I0318 18:10:00.156274 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:10:00 crc kubenswrapper[5008]: I0318 18:10:00.157035 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:10:00 crc kubenswrapper[5008]: I0318 18:10:00.157274 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:10:00 crc kubenswrapper[5008]: I0318 18:10:00.179494 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564290-8znfx"] Mar 18 18:10:00 crc kubenswrapper[5008]: I0318 18:10:00.248450 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmxs7\" (UniqueName: \"kubernetes.io/projected/469ffaa4-8b51-436c-bdf8-d43dcbff1fcd-kube-api-access-pmxs7\") pod \"auto-csr-approver-29564290-8znfx\" (UID: \"469ffaa4-8b51-436c-bdf8-d43dcbff1fcd\") " pod="openshift-infra/auto-csr-approver-29564290-8znfx" Mar 18 18:10:00 crc kubenswrapper[5008]: I0318 18:10:00.349844 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmxs7\" (UniqueName: \"kubernetes.io/projected/469ffaa4-8b51-436c-bdf8-d43dcbff1fcd-kube-api-access-pmxs7\") pod \"auto-csr-approver-29564290-8znfx\" (UID: \"469ffaa4-8b51-436c-bdf8-d43dcbff1fcd\") " pod="openshift-infra/auto-csr-approver-29564290-8znfx" Mar 18 18:10:00 crc kubenswrapper[5008]: I0318 18:10:00.384194 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmxs7\" (UniqueName: \"kubernetes.io/projected/469ffaa4-8b51-436c-bdf8-d43dcbff1fcd-kube-api-access-pmxs7\") pod \"auto-csr-approver-29564290-8znfx\" (UID: \"469ffaa4-8b51-436c-bdf8-d43dcbff1fcd\") " 
pod="openshift-infra/auto-csr-approver-29564290-8znfx" Mar 18 18:10:00 crc kubenswrapper[5008]: I0318 18:10:00.480042 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564290-8znfx" Mar 18 18:10:00 crc kubenswrapper[5008]: I0318 18:10:00.926759 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564290-8znfx"] Mar 18 18:10:00 crc kubenswrapper[5008]: W0318 18:10:00.944634 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod469ffaa4_8b51_436c_bdf8_d43dcbff1fcd.slice/crio-a95b169f51a455d47707b1c797ffd9e9463e157d665d5dd8024bc0add3d16237 WatchSource:0}: Error finding container a95b169f51a455d47707b1c797ffd9e9463e157d665d5dd8024bc0add3d16237: Status 404 returned error can't find the container with id a95b169f51a455d47707b1c797ffd9e9463e157d665d5dd8024bc0add3d16237 Mar 18 18:10:00 crc kubenswrapper[5008]: I0318 18:10:00.967082 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564290-8znfx" event={"ID":"469ffaa4-8b51-436c-bdf8-d43dcbff1fcd","Type":"ContainerStarted","Data":"a95b169f51a455d47707b1c797ffd9e9463e157d665d5dd8024bc0add3d16237"} Mar 18 18:10:02 crc kubenswrapper[5008]: I0318 18:10:02.982297 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564290-8znfx" event={"ID":"469ffaa4-8b51-436c-bdf8-d43dcbff1fcd","Type":"ContainerStarted","Data":"e2234aa472c18db948eca55728aafb0b9ce11cc668725b780d9ab170bb87bdc3"} Mar 18 18:10:03 crc kubenswrapper[5008]: I0318 18:10:03.002756 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564290-8znfx" podStartSLOduration=1.392168179 podStartE2EDuration="3.002734187s" podCreationTimestamp="2026-03-18 18:10:00 +0000 UTC" firstStartedPulling="2026-03-18 18:10:00.950474527 +0000 UTC 
m=+457.469947606" lastFinishedPulling="2026-03-18 18:10:02.561040535 +0000 UTC m=+459.080513614" observedRunningTime="2026-03-18 18:10:02.998264385 +0000 UTC m=+459.517737504" watchObservedRunningTime="2026-03-18 18:10:03.002734187 +0000 UTC m=+459.522207276" Mar 18 18:10:03 crc kubenswrapper[5008]: I0318 18:10:03.580990 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:10:03 crc kubenswrapper[5008]: I0318 18:10:03.640411 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jddwj" Mar 18 18:10:03 crc kubenswrapper[5008]: I0318 18:10:03.993835 5008 generic.go:334] "Generic (PLEG): container finished" podID="469ffaa4-8b51-436c-bdf8-d43dcbff1fcd" containerID="e2234aa472c18db948eca55728aafb0b9ce11cc668725b780d9ab170bb87bdc3" exitCode=0 Mar 18 18:10:03 crc kubenswrapper[5008]: I0318 18:10:03.994045 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564290-8znfx" event={"ID":"469ffaa4-8b51-436c-bdf8-d43dcbff1fcd","Type":"ContainerDied","Data":"e2234aa472c18db948eca55728aafb0b9ce11cc668725b780d9ab170bb87bdc3"} Mar 18 18:10:05 crc kubenswrapper[5008]: I0318 18:10:05.284176 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564290-8znfx" Mar 18 18:10:05 crc kubenswrapper[5008]: I0318 18:10:05.420285 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmxs7\" (UniqueName: \"kubernetes.io/projected/469ffaa4-8b51-436c-bdf8-d43dcbff1fcd-kube-api-access-pmxs7\") pod \"469ffaa4-8b51-436c-bdf8-d43dcbff1fcd\" (UID: \"469ffaa4-8b51-436c-bdf8-d43dcbff1fcd\") " Mar 18 18:10:05 crc kubenswrapper[5008]: I0318 18:10:05.439994 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/469ffaa4-8b51-436c-bdf8-d43dcbff1fcd-kube-api-access-pmxs7" (OuterVolumeSpecName: "kube-api-access-pmxs7") pod "469ffaa4-8b51-436c-bdf8-d43dcbff1fcd" (UID: "469ffaa4-8b51-436c-bdf8-d43dcbff1fcd"). InnerVolumeSpecName "kube-api-access-pmxs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:10:05 crc kubenswrapper[5008]: I0318 18:10:05.521831 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmxs7\" (UniqueName: \"kubernetes.io/projected/469ffaa4-8b51-436c-bdf8-d43dcbff1fcd-kube-api-access-pmxs7\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:06 crc kubenswrapper[5008]: I0318 18:10:06.012146 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564290-8znfx" event={"ID":"469ffaa4-8b51-436c-bdf8-d43dcbff1fcd","Type":"ContainerDied","Data":"a95b169f51a455d47707b1c797ffd9e9463e157d665d5dd8024bc0add3d16237"} Mar 18 18:10:06 crc kubenswrapper[5008]: I0318 18:10:06.012878 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a95b169f51a455d47707b1c797ffd9e9463e157d665d5dd8024bc0add3d16237" Mar 18 18:10:06 crc kubenswrapper[5008]: I0318 18:10:06.012244 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564290-8znfx" Mar 18 18:10:06 crc kubenswrapper[5008]: I0318 18:10:06.071942 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564284-6tsrt"] Mar 18 18:10:06 crc kubenswrapper[5008]: I0318 18:10:06.075759 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564284-6tsrt"] Mar 18 18:10:06 crc kubenswrapper[5008]: I0318 18:10:06.206822 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d04574-0631-403a-8bf5-4127787463d7" path="/var/lib/kubelet/pods/51d04574-0631-403a-8bf5-4127787463d7/volumes" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.100288 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" podUID="d02d52ba-4ba4-47b2-b0f3-a769e009d161" containerName="registry" containerID="cri-o://6e66c8b3b1bd2dd3662cb9abb15f4c295129daa7311236fedc9e0a1a359b6189" gracePeriod=30 Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.432050 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.590342 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d02d52ba-4ba4-47b2-b0f3-a769e009d161-ca-trust-extracted\") pod \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.597661 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d02d52ba-4ba4-47b2-b0f3-a769e009d161-trusted-ca\") pod \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.597750 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d02d52ba-4ba4-47b2-b0f3-a769e009d161-installation-pull-secrets\") pod \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.597777 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-bound-sa-token\") pod \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.597825 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg8nb\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-kube-api-access-wg8nb\") pod \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.597848 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-registry-tls\") pod \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.597879 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d02d52ba-4ba4-47b2-b0f3-a769e009d161-registry-certificates\") pod \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.597999 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\" (UID: \"d02d52ba-4ba4-47b2-b0f3-a769e009d161\") " Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.600787 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02d52ba-4ba4-47b2-b0f3-a769e009d161-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d02d52ba-4ba4-47b2-b0f3-a769e009d161" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.601213 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d02d52ba-4ba4-47b2-b0f3-a769e009d161-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.601380 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02d52ba-4ba4-47b2-b0f3-a769e009d161-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d02d52ba-4ba4-47b2-b0f3-a769e009d161" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.608829 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d02d52ba-4ba4-47b2-b0f3-a769e009d161-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d02d52ba-4ba4-47b2-b0f3-a769e009d161" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.612084 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-kube-api-access-wg8nb" (OuterVolumeSpecName: "kube-api-access-wg8nb") pod "d02d52ba-4ba4-47b2-b0f3-a769e009d161" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161"). InnerVolumeSpecName "kube-api-access-wg8nb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.612884 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d02d52ba-4ba4-47b2-b0f3-a769e009d161" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.613746 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02d52ba-4ba4-47b2-b0f3-a769e009d161-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d02d52ba-4ba4-47b2-b0f3-a769e009d161" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.615602 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d02d52ba-4ba4-47b2-b0f3-a769e009d161" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.619894 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "d02d52ba-4ba4-47b2-b0f3-a769e009d161" (UID: "d02d52ba-4ba4-47b2-b0f3-a769e009d161"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.702504 5008 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d02d52ba-4ba4-47b2-b0f3-a769e009d161-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.702582 5008 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d02d52ba-4ba4-47b2-b0f3-a769e009d161-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.702601 5008 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.702615 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg8nb\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-kube-api-access-wg8nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.702631 5008 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d02d52ba-4ba4-47b2-b0f3-a769e009d161-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:09 crc kubenswrapper[5008]: I0318 18:10:09.702643 5008 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d02d52ba-4ba4-47b2-b0f3-a769e009d161-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 18:10:10 crc kubenswrapper[5008]: I0318 18:10:10.037276 5008 generic.go:334] "Generic (PLEG): container finished" podID="d02d52ba-4ba4-47b2-b0f3-a769e009d161" containerID="6e66c8b3b1bd2dd3662cb9abb15f4c295129daa7311236fedc9e0a1a359b6189" exitCode=0 Mar 18 18:10:10 crc 
kubenswrapper[5008]: I0318 18:10:10.037329 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" event={"ID":"d02d52ba-4ba4-47b2-b0f3-a769e009d161","Type":"ContainerDied","Data":"6e66c8b3b1bd2dd3662cb9abb15f4c295129daa7311236fedc9e0a1a359b6189"} Mar 18 18:10:10 crc kubenswrapper[5008]: I0318 18:10:10.037364 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" event={"ID":"d02d52ba-4ba4-47b2-b0f3-a769e009d161","Type":"ContainerDied","Data":"61b317bd85bab92a566e47702b9e8701e59d2dfc19c5692c2aa133d9eb047fdc"} Mar 18 18:10:10 crc kubenswrapper[5008]: I0318 18:10:10.037386 5008 scope.go:117] "RemoveContainer" containerID="6e66c8b3b1bd2dd3662cb9abb15f4c295129daa7311236fedc9e0a1a359b6189" Mar 18 18:10:10 crc kubenswrapper[5008]: I0318 18:10:10.037500 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:10:11 crc kubenswrapper[5008]: I0318 18:10:11.157080 5008 scope.go:117] "RemoveContainer" containerID="6e66c8b3b1bd2dd3662cb9abb15f4c295129daa7311236fedc9e0a1a359b6189" Mar 18 18:10:11 crc kubenswrapper[5008]: E0318 18:10:11.157841 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e66c8b3b1bd2dd3662cb9abb15f4c295129daa7311236fedc9e0a1a359b6189\": container with ID starting with 6e66c8b3b1bd2dd3662cb9abb15f4c295129daa7311236fedc9e0a1a359b6189 not found: ID does not exist" containerID="6e66c8b3b1bd2dd3662cb9abb15f4c295129daa7311236fedc9e0a1a359b6189" Mar 18 18:10:11 crc kubenswrapper[5008]: I0318 18:10:11.157875 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e66c8b3b1bd2dd3662cb9abb15f4c295129daa7311236fedc9e0a1a359b6189"} err="failed to get container status 
\"6e66c8b3b1bd2dd3662cb9abb15f4c295129daa7311236fedc9e0a1a359b6189\": rpc error: code = NotFound desc = could not find container \"6e66c8b3b1bd2dd3662cb9abb15f4c295129daa7311236fedc9e0a1a359b6189\": container with ID starting with 6e66c8b3b1bd2dd3662cb9abb15f4c295129daa7311236fedc9e0a1a359b6189 not found: ID does not exist" Mar 18 18:10:41 crc kubenswrapper[5008]: I0318 18:10:41.136654 5008 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","podd02d52ba-4ba4-47b2-b0f3-a769e009d161"] err="unable to destroy cgroup paths for cgroup [kubepods burstable podd02d52ba-4ba4-47b2-b0f3-a769e009d161] : Timed out while waiting for systemd to remove kubepods-burstable-podd02d52ba_4ba4_47b2_b0f3_a769e009d161.slice" Mar 18 18:10:41 crc kubenswrapper[5008]: E0318 18:10:41.137147 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable podd02d52ba-4ba4-47b2-b0f3-a769e009d161] : unable to destroy cgroup paths for cgroup [kubepods burstable podd02d52ba-4ba4-47b2-b0f3-a769e009d161] : Timed out while waiting for systemd to remove kubepods-burstable-podd02d52ba_4ba4_47b2_b0f3_a769e009d161.slice" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" podUID="d02d52ba-4ba4-47b2-b0f3-a769e009d161" Mar 18 18:10:41 crc kubenswrapper[5008]: I0318 18:10:41.237462 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5gw26" Mar 18 18:10:41 crc kubenswrapper[5008]: I0318 18:10:41.275387 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5gw26"] Mar 18 18:10:41 crc kubenswrapper[5008]: I0318 18:10:41.278240 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5gw26"] Mar 18 18:10:42 crc kubenswrapper[5008]: I0318 18:10:42.205586 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02d52ba-4ba4-47b2-b0f3-a769e009d161" path="/var/lib/kubelet/pods/d02d52ba-4ba4-47b2-b0f3-a769e009d161/volumes" Mar 18 18:11:54 crc kubenswrapper[5008]: I0318 18:11:54.460127 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:11:54 crc kubenswrapper[5008]: I0318 18:11:54.461677 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.149592 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564292-hwhk6"] Mar 18 18:12:00 crc kubenswrapper[5008]: E0318 18:12:00.151046 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469ffaa4-8b51-436c-bdf8-d43dcbff1fcd" containerName="oc" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.151085 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="469ffaa4-8b51-436c-bdf8-d43dcbff1fcd" containerName="oc" Mar 18 
18:12:00 crc kubenswrapper[5008]: E0318 18:12:00.151103 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02d52ba-4ba4-47b2-b0f3-a769e009d161" containerName="registry" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.151112 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02d52ba-4ba4-47b2-b0f3-a769e009d161" containerName="registry" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.151250 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="469ffaa4-8b51-436c-bdf8-d43dcbff1fcd" containerName="oc" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.151284 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02d52ba-4ba4-47b2-b0f3-a769e009d161" containerName="registry" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.151808 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564292-hwhk6" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.153607 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.154172 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.157155 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.157402 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564292-hwhk6"] Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.185323 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvn88\" (UniqueName: \"kubernetes.io/projected/8a02e271-d56f-49b0-98bf-18b9ac62f364-kube-api-access-cvn88\") pod \"auto-csr-approver-29564292-hwhk6\" (UID: 
\"8a02e271-d56f-49b0-98bf-18b9ac62f364\") " pod="openshift-infra/auto-csr-approver-29564292-hwhk6" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.288075 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvn88\" (UniqueName: \"kubernetes.io/projected/8a02e271-d56f-49b0-98bf-18b9ac62f364-kube-api-access-cvn88\") pod \"auto-csr-approver-29564292-hwhk6\" (UID: \"8a02e271-d56f-49b0-98bf-18b9ac62f364\") " pod="openshift-infra/auto-csr-approver-29564292-hwhk6" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.311287 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvn88\" (UniqueName: \"kubernetes.io/projected/8a02e271-d56f-49b0-98bf-18b9ac62f364-kube-api-access-cvn88\") pod \"auto-csr-approver-29564292-hwhk6\" (UID: \"8a02e271-d56f-49b0-98bf-18b9ac62f364\") " pod="openshift-infra/auto-csr-approver-29564292-hwhk6" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.474849 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564292-hwhk6" Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.745426 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564292-hwhk6"] Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.754757 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:12:00 crc kubenswrapper[5008]: I0318 18:12:00.758620 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564292-hwhk6" event={"ID":"8a02e271-d56f-49b0-98bf-18b9ac62f364","Type":"ContainerStarted","Data":"50aad9efdd16a594e2b88cf9509fad8e4839be6b036515fac3251230c5d4cd05"} Mar 18 18:12:02 crc kubenswrapper[5008]: I0318 18:12:02.772750 5008 generic.go:334] "Generic (PLEG): container finished" podID="8a02e271-d56f-49b0-98bf-18b9ac62f364" containerID="68d1c1a92676bebaf4b365e611b18c4eb02dea01c14193a91b94bf2cb447f2ae" exitCode=0 Mar 18 18:12:02 crc kubenswrapper[5008]: I0318 18:12:02.772831 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564292-hwhk6" event={"ID":"8a02e271-d56f-49b0-98bf-18b9ac62f364","Type":"ContainerDied","Data":"68d1c1a92676bebaf4b365e611b18c4eb02dea01c14193a91b94bf2cb447f2ae"} Mar 18 18:12:04 crc kubenswrapper[5008]: I0318 18:12:04.056151 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564292-hwhk6" Mar 18 18:12:04 crc kubenswrapper[5008]: I0318 18:12:04.160873 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvn88\" (UniqueName: \"kubernetes.io/projected/8a02e271-d56f-49b0-98bf-18b9ac62f364-kube-api-access-cvn88\") pod \"8a02e271-d56f-49b0-98bf-18b9ac62f364\" (UID: \"8a02e271-d56f-49b0-98bf-18b9ac62f364\") " Mar 18 18:12:04 crc kubenswrapper[5008]: I0318 18:12:04.168703 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a02e271-d56f-49b0-98bf-18b9ac62f364-kube-api-access-cvn88" (OuterVolumeSpecName: "kube-api-access-cvn88") pod "8a02e271-d56f-49b0-98bf-18b9ac62f364" (UID: "8a02e271-d56f-49b0-98bf-18b9ac62f364"). InnerVolumeSpecName "kube-api-access-cvn88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:12:04 crc kubenswrapper[5008]: I0318 18:12:04.263163 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvn88\" (UniqueName: \"kubernetes.io/projected/8a02e271-d56f-49b0-98bf-18b9ac62f364-kube-api-access-cvn88\") on node \"crc\" DevicePath \"\"" Mar 18 18:12:04 crc kubenswrapper[5008]: I0318 18:12:04.789611 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564292-hwhk6" event={"ID":"8a02e271-d56f-49b0-98bf-18b9ac62f364","Type":"ContainerDied","Data":"50aad9efdd16a594e2b88cf9509fad8e4839be6b036515fac3251230c5d4cd05"} Mar 18 18:12:04 crc kubenswrapper[5008]: I0318 18:12:04.789674 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50aad9efdd16a594e2b88cf9509fad8e4839be6b036515fac3251230c5d4cd05" Mar 18 18:12:04 crc kubenswrapper[5008]: I0318 18:12:04.789671 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564292-hwhk6" Mar 18 18:12:05 crc kubenswrapper[5008]: I0318 18:12:05.117498 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564286-cfxqp"] Mar 18 18:12:05 crc kubenswrapper[5008]: I0318 18:12:05.124286 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564286-cfxqp"] Mar 18 18:12:06 crc kubenswrapper[5008]: I0318 18:12:06.208926 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5424805e-15fc-4424-8942-93f7095e148b" path="/var/lib/kubelet/pods/5424805e-15fc-4424-8942-93f7095e148b/volumes" Mar 18 18:12:24 crc kubenswrapper[5008]: I0318 18:12:24.460805 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:12:24 crc kubenswrapper[5008]: I0318 18:12:24.463320 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:12:54 crc kubenswrapper[5008]: I0318 18:12:54.460331 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:12:54 crc kubenswrapper[5008]: I0318 18:12:54.460792 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" 
podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:12:54 crc kubenswrapper[5008]: I0318 18:12:54.460847 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:12:54 crc kubenswrapper[5008]: I0318 18:12:54.461317 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ccc64188129ed87aa4fe6905ebfe183cf7aa5b27085343f7bce874d89c57b22"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:12:54 crc kubenswrapper[5008]: I0318 18:12:54.461359 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://4ccc64188129ed87aa4fe6905ebfe183cf7aa5b27085343f7bce874d89c57b22" gracePeriod=600 Mar 18 18:12:55 crc kubenswrapper[5008]: I0318 18:12:55.125114 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="4ccc64188129ed87aa4fe6905ebfe183cf7aa5b27085343f7bce874d89c57b22" exitCode=0 Mar 18 18:12:55 crc kubenswrapper[5008]: I0318 18:12:55.125230 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"4ccc64188129ed87aa4fe6905ebfe183cf7aa5b27085343f7bce874d89c57b22"} Mar 18 18:12:55 crc kubenswrapper[5008]: I0318 18:12:55.125477 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"214d6c92536a599ebc37d353c0d760916c05e754a965960c9c93d0c42cd15af6"} Mar 18 18:12:55 crc kubenswrapper[5008]: I0318 18:12:55.125501 5008 scope.go:117] "RemoveContainer" containerID="265c93bd38176e028a3b20d735aef8eb6b45124abbc855b3703820d202fa1f53" Mar 18 18:13:35 crc kubenswrapper[5008]: I0318 18:13:35.991429 5008 scope.go:117] "RemoveContainer" containerID="0f51e59d155f6c23cba548e1ae71e19633e8ada6dc30593a5e7c0d5952f1436b" Mar 18 18:13:36 crc kubenswrapper[5008]: I0318 18:13:36.020600 5008 scope.go:117] "RemoveContainer" containerID="2654336f098a4c27f83fdd77a681cf3086db6e716a1aa17b63985649718fd08c" Mar 18 18:13:36 crc kubenswrapper[5008]: I0318 18:13:36.084253 5008 scope.go:117] "RemoveContainer" containerID="be96d29eb9bb52d2253258f92c3072bdf91e838b5357141b7231eb29b432cf6e" Mar 18 18:14:00 crc kubenswrapper[5008]: I0318 18:14:00.152939 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564294-g5m8z"] Mar 18 18:14:00 crc kubenswrapper[5008]: E0318 18:14:00.153596 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a02e271-d56f-49b0-98bf-18b9ac62f364" containerName="oc" Mar 18 18:14:00 crc kubenswrapper[5008]: I0318 18:14:00.153608 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a02e271-d56f-49b0-98bf-18b9ac62f364" containerName="oc" Mar 18 18:14:00 crc kubenswrapper[5008]: I0318 18:14:00.153697 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a02e271-d56f-49b0-98bf-18b9ac62f364" containerName="oc" Mar 18 18:14:00 crc kubenswrapper[5008]: I0318 18:14:00.154024 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564294-g5m8z" Mar 18 18:14:00 crc kubenswrapper[5008]: I0318 18:14:00.158326 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:14:00 crc kubenswrapper[5008]: I0318 18:14:00.159030 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:14:00 crc kubenswrapper[5008]: I0318 18:14:00.159326 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:14:00 crc kubenswrapper[5008]: I0318 18:14:00.171140 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564294-g5m8z"] Mar 18 18:14:00 crc kubenswrapper[5008]: I0318 18:14:00.280199 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mchdj\" (UniqueName: \"kubernetes.io/projected/424dea02-32aa-4bb7-913c-dc9eec1a265b-kube-api-access-mchdj\") pod \"auto-csr-approver-29564294-g5m8z\" (UID: \"424dea02-32aa-4bb7-913c-dc9eec1a265b\") " pod="openshift-infra/auto-csr-approver-29564294-g5m8z" Mar 18 18:14:00 crc kubenswrapper[5008]: I0318 18:14:00.382186 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mchdj\" (UniqueName: \"kubernetes.io/projected/424dea02-32aa-4bb7-913c-dc9eec1a265b-kube-api-access-mchdj\") pod \"auto-csr-approver-29564294-g5m8z\" (UID: \"424dea02-32aa-4bb7-913c-dc9eec1a265b\") " pod="openshift-infra/auto-csr-approver-29564294-g5m8z" Mar 18 18:14:00 crc kubenswrapper[5008]: I0318 18:14:00.413325 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mchdj\" (UniqueName: \"kubernetes.io/projected/424dea02-32aa-4bb7-913c-dc9eec1a265b-kube-api-access-mchdj\") pod \"auto-csr-approver-29564294-g5m8z\" (UID: \"424dea02-32aa-4bb7-913c-dc9eec1a265b\") " 
pod="openshift-infra/auto-csr-approver-29564294-g5m8z" Mar 18 18:14:00 crc kubenswrapper[5008]: I0318 18:14:00.483789 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564294-g5m8z" Mar 18 18:14:00 crc kubenswrapper[5008]: I0318 18:14:00.733723 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564294-g5m8z"] Mar 18 18:14:01 crc kubenswrapper[5008]: I0318 18:14:01.566230 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564294-g5m8z" event={"ID":"424dea02-32aa-4bb7-913c-dc9eec1a265b","Type":"ContainerStarted","Data":"3658045d9902926a4e8b6d21b68fafa227e4f6d5f241723b4d9b4ceaea73a478"} Mar 18 18:14:02 crc kubenswrapper[5008]: I0318 18:14:02.574197 5008 generic.go:334] "Generic (PLEG): container finished" podID="424dea02-32aa-4bb7-913c-dc9eec1a265b" containerID="9fc3f1598faae046bac8c5b726f1b351063210ed7b800195b88579c2e1a1bc64" exitCode=0 Mar 18 18:14:02 crc kubenswrapper[5008]: I0318 18:14:02.574530 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564294-g5m8z" event={"ID":"424dea02-32aa-4bb7-913c-dc9eec1a265b","Type":"ContainerDied","Data":"9fc3f1598faae046bac8c5b726f1b351063210ed7b800195b88579c2e1a1bc64"} Mar 18 18:14:03 crc kubenswrapper[5008]: I0318 18:14:03.806146 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564294-g5m8z" Mar 18 18:14:03 crc kubenswrapper[5008]: I0318 18:14:03.825181 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mchdj\" (UniqueName: \"kubernetes.io/projected/424dea02-32aa-4bb7-913c-dc9eec1a265b-kube-api-access-mchdj\") pod \"424dea02-32aa-4bb7-913c-dc9eec1a265b\" (UID: \"424dea02-32aa-4bb7-913c-dc9eec1a265b\") " Mar 18 18:14:03 crc kubenswrapper[5008]: I0318 18:14:03.831596 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/424dea02-32aa-4bb7-913c-dc9eec1a265b-kube-api-access-mchdj" (OuterVolumeSpecName: "kube-api-access-mchdj") pod "424dea02-32aa-4bb7-913c-dc9eec1a265b" (UID: "424dea02-32aa-4bb7-913c-dc9eec1a265b"). InnerVolumeSpecName "kube-api-access-mchdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:14:03 crc kubenswrapper[5008]: I0318 18:14:03.926548 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mchdj\" (UniqueName: \"kubernetes.io/projected/424dea02-32aa-4bb7-913c-dc9eec1a265b-kube-api-access-mchdj\") on node \"crc\" DevicePath \"\"" Mar 18 18:14:04 crc kubenswrapper[5008]: I0318 18:14:04.587284 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564294-g5m8z" event={"ID":"424dea02-32aa-4bb7-913c-dc9eec1a265b","Type":"ContainerDied","Data":"3658045d9902926a4e8b6d21b68fafa227e4f6d5f241723b4d9b4ceaea73a478"} Mar 18 18:14:04 crc kubenswrapper[5008]: I0318 18:14:04.587329 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3658045d9902926a4e8b6d21b68fafa227e4f6d5f241723b4d9b4ceaea73a478" Mar 18 18:14:04 crc kubenswrapper[5008]: I0318 18:14:04.587379 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564294-g5m8z" Mar 18 18:14:04 crc kubenswrapper[5008]: I0318 18:14:04.910357 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564288-l8f4q"] Mar 18 18:14:04 crc kubenswrapper[5008]: I0318 18:14:04.917090 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564288-l8f4q"] Mar 18 18:14:06 crc kubenswrapper[5008]: I0318 18:14:06.205644 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17399326-c5b7-4432-a967-87849a37fc80" path="/var/lib/kubelet/pods/17399326-c5b7-4432-a967-87849a37fc80/volumes" Mar 18 18:14:36 crc kubenswrapper[5008]: I0318 18:14:36.205460 5008 scope.go:117] "RemoveContainer" containerID="3a0973bb6d788209f3f6165f4dd85a1b5820d107f7a5172f083f4787d6d13a46" Mar 18 18:14:54 crc kubenswrapper[5008]: I0318 18:14:54.460189 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:14:54 crc kubenswrapper[5008]: I0318 18:14:54.460833 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.156189 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx"] Mar 18 18:15:00 crc kubenswrapper[5008]: E0318 18:15:00.156965 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424dea02-32aa-4bb7-913c-dc9eec1a265b" containerName="oc" Mar 18 
18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.156988 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="424dea02-32aa-4bb7-913c-dc9eec1a265b" containerName="oc" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.157172 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="424dea02-32aa-4bb7-913c-dc9eec1a265b" containerName="oc" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.157792 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.160416 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.165252 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.167507 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx"] Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.189033 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6cbb1f62-5be6-446b-b529-8d5137ab403d-config-volume\") pod \"collect-profiles-29564295-k4stx\" (UID: \"6cbb1f62-5be6-446b-b529-8d5137ab403d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.190099 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6cbb1f62-5be6-446b-b529-8d5137ab403d-secret-volume\") pod \"collect-profiles-29564295-k4stx\" (UID: \"6cbb1f62-5be6-446b-b529-8d5137ab403d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.190360 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lqd8\" (UniqueName: \"kubernetes.io/projected/6cbb1f62-5be6-446b-b529-8d5137ab403d-kube-api-access-4lqd8\") pod \"collect-profiles-29564295-k4stx\" (UID: \"6cbb1f62-5be6-446b-b529-8d5137ab403d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.290907 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lqd8\" (UniqueName: \"kubernetes.io/projected/6cbb1f62-5be6-446b-b529-8d5137ab403d-kube-api-access-4lqd8\") pod \"collect-profiles-29564295-k4stx\" (UID: \"6cbb1f62-5be6-446b-b529-8d5137ab403d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.291030 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6cbb1f62-5be6-446b-b529-8d5137ab403d-config-volume\") pod \"collect-profiles-29564295-k4stx\" (UID: \"6cbb1f62-5be6-446b-b529-8d5137ab403d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.291072 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6cbb1f62-5be6-446b-b529-8d5137ab403d-secret-volume\") pod \"collect-profiles-29564295-k4stx\" (UID: \"6cbb1f62-5be6-446b-b529-8d5137ab403d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.293135 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/6cbb1f62-5be6-446b-b529-8d5137ab403d-config-volume\") pod \"collect-profiles-29564295-k4stx\" (UID: \"6cbb1f62-5be6-446b-b529-8d5137ab403d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.298670 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6cbb1f62-5be6-446b-b529-8d5137ab403d-secret-volume\") pod \"collect-profiles-29564295-k4stx\" (UID: \"6cbb1f62-5be6-446b-b529-8d5137ab403d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.311046 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lqd8\" (UniqueName: \"kubernetes.io/projected/6cbb1f62-5be6-446b-b529-8d5137ab403d-kube-api-access-4lqd8\") pod \"collect-profiles-29564295-k4stx\" (UID: \"6cbb1f62-5be6-446b-b529-8d5137ab403d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.509694 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.742035 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx"] Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.971370 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" event={"ID":"6cbb1f62-5be6-446b-b529-8d5137ab403d","Type":"ContainerStarted","Data":"2a2feddf140861300acf9920789a4be48f7fd0da7e743a1816618bf2746b0ead"} Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.971437 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" event={"ID":"6cbb1f62-5be6-446b-b529-8d5137ab403d","Type":"ContainerStarted","Data":"ca82653ae65bc603d621fae9875c4862d02ae02713dd03686155d125b5ef6d63"} Mar 18 18:15:00 crc kubenswrapper[5008]: I0318 18:15:00.994387 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" podStartSLOduration=0.994365681 podStartE2EDuration="994.365681ms" podCreationTimestamp="2026-03-18 18:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:15:00.990413657 +0000 UTC m=+757.509886766" watchObservedRunningTime="2026-03-18 18:15:00.994365681 +0000 UTC m=+757.513838770" Mar 18 18:15:01 crc kubenswrapper[5008]: I0318 18:15:01.981214 5008 generic.go:334] "Generic (PLEG): container finished" podID="6cbb1f62-5be6-446b-b529-8d5137ab403d" containerID="2a2feddf140861300acf9920789a4be48f7fd0da7e743a1816618bf2746b0ead" exitCode=0 Mar 18 18:15:01 crc kubenswrapper[5008]: I0318 18:15:01.981353 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" event={"ID":"6cbb1f62-5be6-446b-b529-8d5137ab403d","Type":"ContainerDied","Data":"2a2feddf140861300acf9920789a4be48f7fd0da7e743a1816618bf2746b0ead"} Mar 18 18:15:03 crc kubenswrapper[5008]: I0318 18:15:03.286007 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" Mar 18 18:15:03 crc kubenswrapper[5008]: I0318 18:15:03.435470 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6cbb1f62-5be6-446b-b529-8d5137ab403d-secret-volume\") pod \"6cbb1f62-5be6-446b-b529-8d5137ab403d\" (UID: \"6cbb1f62-5be6-446b-b529-8d5137ab403d\") " Mar 18 18:15:03 crc kubenswrapper[5008]: I0318 18:15:03.435694 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6cbb1f62-5be6-446b-b529-8d5137ab403d-config-volume\") pod \"6cbb1f62-5be6-446b-b529-8d5137ab403d\" (UID: \"6cbb1f62-5be6-446b-b529-8d5137ab403d\") " Mar 18 18:15:03 crc kubenswrapper[5008]: I0318 18:15:03.435834 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lqd8\" (UniqueName: \"kubernetes.io/projected/6cbb1f62-5be6-446b-b529-8d5137ab403d-kube-api-access-4lqd8\") pod \"6cbb1f62-5be6-446b-b529-8d5137ab403d\" (UID: \"6cbb1f62-5be6-446b-b529-8d5137ab403d\") " Mar 18 18:15:03 crc kubenswrapper[5008]: I0318 18:15:03.436275 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cbb1f62-5be6-446b-b529-8d5137ab403d-config-volume" (OuterVolumeSpecName: "config-volume") pod "6cbb1f62-5be6-446b-b529-8d5137ab403d" (UID: "6cbb1f62-5be6-446b-b529-8d5137ab403d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:15:03 crc kubenswrapper[5008]: I0318 18:15:03.444116 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cbb1f62-5be6-446b-b529-8d5137ab403d-kube-api-access-4lqd8" (OuterVolumeSpecName: "kube-api-access-4lqd8") pod "6cbb1f62-5be6-446b-b529-8d5137ab403d" (UID: "6cbb1f62-5be6-446b-b529-8d5137ab403d"). InnerVolumeSpecName "kube-api-access-4lqd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:15:03 crc kubenswrapper[5008]: I0318 18:15:03.444166 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cbb1f62-5be6-446b-b529-8d5137ab403d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6cbb1f62-5be6-446b-b529-8d5137ab403d" (UID: "6cbb1f62-5be6-446b-b529-8d5137ab403d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:15:03 crc kubenswrapper[5008]: I0318 18:15:03.536795 5008 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6cbb1f62-5be6-446b-b529-8d5137ab403d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:03 crc kubenswrapper[5008]: I0318 18:15:03.536823 5008 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6cbb1f62-5be6-446b-b529-8d5137ab403d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:03 crc kubenswrapper[5008]: I0318 18:15:03.536833 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lqd8\" (UniqueName: \"kubernetes.io/projected/6cbb1f62-5be6-446b-b529-8d5137ab403d-kube-api-access-4lqd8\") on node \"crc\" DevicePath \"\"" Mar 18 18:15:03 crc kubenswrapper[5008]: I0318 18:15:03.993609 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" 
event={"ID":"6cbb1f62-5be6-446b-b529-8d5137ab403d","Type":"ContainerDied","Data":"ca82653ae65bc603d621fae9875c4862d02ae02713dd03686155d125b5ef6d63"} Mar 18 18:15:03 crc kubenswrapper[5008]: I0318 18:15:03.993969 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca82653ae65bc603d621fae9875c4862d02ae02713dd03686155d125b5ef6d63" Mar 18 18:15:03 crc kubenswrapper[5008]: I0318 18:15:03.993662 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx" Mar 18 18:15:24 crc kubenswrapper[5008]: I0318 18:15:24.461050 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:15:24 crc kubenswrapper[5008]: I0318 18:15:24.461736 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:15:54 crc kubenswrapper[5008]: I0318 18:15:54.460533 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:15:54 crc kubenswrapper[5008]: I0318 18:15:54.461425 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:15:54 crc kubenswrapper[5008]: I0318 18:15:54.461538 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:15:54 crc kubenswrapper[5008]: I0318 18:15:54.462494 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"214d6c92536a599ebc37d353c0d760916c05e754a965960c9c93d0c42cd15af6"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:15:54 crc kubenswrapper[5008]: I0318 18:15:54.462632 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://214d6c92536a599ebc37d353c0d760916c05e754a965960c9c93d0c42cd15af6" gracePeriod=600 Mar 18 18:15:55 crc kubenswrapper[5008]: I0318 18:15:55.355818 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="214d6c92536a599ebc37d353c0d760916c05e754a965960c9c93d0c42cd15af6" exitCode=0 Mar 18 18:15:55 crc kubenswrapper[5008]: I0318 18:15:55.355868 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"214d6c92536a599ebc37d353c0d760916c05e754a965960c9c93d0c42cd15af6"} Mar 18 18:15:55 crc kubenswrapper[5008]: I0318 18:15:55.356233 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" 
event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"f98223e188c7e180bb9c16b9b888a18eaae99967d91bf2ff048b12e80fd84a1c"} Mar 18 18:15:55 crc kubenswrapper[5008]: I0318 18:15:55.356256 5008 scope.go:117] "RemoveContainer" containerID="4ccc64188129ed87aa4fe6905ebfe183cf7aa5b27085343f7bce874d89c57b22" Mar 18 18:16:00 crc kubenswrapper[5008]: I0318 18:16:00.150218 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564296-8jv4l"] Mar 18 18:16:00 crc kubenswrapper[5008]: E0318 18:16:00.151757 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbb1f62-5be6-446b-b529-8d5137ab403d" containerName="collect-profiles" Mar 18 18:16:00 crc kubenswrapper[5008]: I0318 18:16:00.151782 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbb1f62-5be6-446b-b529-8d5137ab403d" containerName="collect-profiles" Mar 18 18:16:00 crc kubenswrapper[5008]: I0318 18:16:00.151971 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbb1f62-5be6-446b-b529-8d5137ab403d" containerName="collect-profiles" Mar 18 18:16:00 crc kubenswrapper[5008]: I0318 18:16:00.152745 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564296-8jv4l" Mar 18 18:16:00 crc kubenswrapper[5008]: I0318 18:16:00.155603 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:16:00 crc kubenswrapper[5008]: I0318 18:16:00.155880 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:16:00 crc kubenswrapper[5008]: I0318 18:16:00.156473 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:16:00 crc kubenswrapper[5008]: I0318 18:16:00.167840 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564296-8jv4l"] Mar 18 18:16:00 crc kubenswrapper[5008]: I0318 18:16:00.313213 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v5tl\" (UniqueName: \"kubernetes.io/projected/25c5dcfb-6da1-4284-9d73-08a9e08a3e85-kube-api-access-8v5tl\") pod \"auto-csr-approver-29564296-8jv4l\" (UID: \"25c5dcfb-6da1-4284-9d73-08a9e08a3e85\") " pod="openshift-infra/auto-csr-approver-29564296-8jv4l" Mar 18 18:16:00 crc kubenswrapper[5008]: I0318 18:16:00.414337 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5tl\" (UniqueName: \"kubernetes.io/projected/25c5dcfb-6da1-4284-9d73-08a9e08a3e85-kube-api-access-8v5tl\") pod \"auto-csr-approver-29564296-8jv4l\" (UID: \"25c5dcfb-6da1-4284-9d73-08a9e08a3e85\") " pod="openshift-infra/auto-csr-approver-29564296-8jv4l" Mar 18 18:16:00 crc kubenswrapper[5008]: I0318 18:16:00.435919 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5tl\" (UniqueName: \"kubernetes.io/projected/25c5dcfb-6da1-4284-9d73-08a9e08a3e85-kube-api-access-8v5tl\") pod \"auto-csr-approver-29564296-8jv4l\" (UID: \"25c5dcfb-6da1-4284-9d73-08a9e08a3e85\") " 
pod="openshift-infra/auto-csr-approver-29564296-8jv4l" Mar 18 18:16:00 crc kubenswrapper[5008]: I0318 18:16:00.479701 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564296-8jv4l" Mar 18 18:16:00 crc kubenswrapper[5008]: I0318 18:16:00.726195 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564296-8jv4l"] Mar 18 18:16:01 crc kubenswrapper[5008]: I0318 18:16:01.298531 5008 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 18:16:01 crc kubenswrapper[5008]: I0318 18:16:01.394748 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564296-8jv4l" event={"ID":"25c5dcfb-6da1-4284-9d73-08a9e08a3e85","Type":"ContainerStarted","Data":"ee55d98070075a73b5b2a7bf8febedf70e680b632a5ec3047cddf3560cfd2221"} Mar 18 18:16:03 crc kubenswrapper[5008]: I0318 18:16:03.409760 5008 generic.go:334] "Generic (PLEG): container finished" podID="25c5dcfb-6da1-4284-9d73-08a9e08a3e85" containerID="9d7e5ba332e27f2ccb008910b992f9726d91d3d9540fae35b5488a12d7d5acb1" exitCode=0 Mar 18 18:16:03 crc kubenswrapper[5008]: I0318 18:16:03.409834 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564296-8jv4l" event={"ID":"25c5dcfb-6da1-4284-9d73-08a9e08a3e85","Type":"ContainerDied","Data":"9d7e5ba332e27f2ccb008910b992f9726d91d3d9540fae35b5488a12d7d5acb1"} Mar 18 18:16:04 crc kubenswrapper[5008]: I0318 18:16:04.683694 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564296-8jv4l" Mar 18 18:16:05 crc kubenswrapper[5008]: I0318 18:16:05.025272 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v5tl\" (UniqueName: \"kubernetes.io/projected/25c5dcfb-6da1-4284-9d73-08a9e08a3e85-kube-api-access-8v5tl\") pod \"25c5dcfb-6da1-4284-9d73-08a9e08a3e85\" (UID: \"25c5dcfb-6da1-4284-9d73-08a9e08a3e85\") " Mar 18 18:16:05 crc kubenswrapper[5008]: I0318 18:16:05.041306 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25c5dcfb-6da1-4284-9d73-08a9e08a3e85-kube-api-access-8v5tl" (OuterVolumeSpecName: "kube-api-access-8v5tl") pod "25c5dcfb-6da1-4284-9d73-08a9e08a3e85" (UID: "25c5dcfb-6da1-4284-9d73-08a9e08a3e85"). InnerVolumeSpecName "kube-api-access-8v5tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:16:05 crc kubenswrapper[5008]: I0318 18:16:05.127827 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v5tl\" (UniqueName: \"kubernetes.io/projected/25c5dcfb-6da1-4284-9d73-08a9e08a3e85-kube-api-access-8v5tl\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:05 crc kubenswrapper[5008]: I0318 18:16:05.424987 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564296-8jv4l" event={"ID":"25c5dcfb-6da1-4284-9d73-08a9e08a3e85","Type":"ContainerDied","Data":"ee55d98070075a73b5b2a7bf8febedf70e680b632a5ec3047cddf3560cfd2221"} Mar 18 18:16:05 crc kubenswrapper[5008]: I0318 18:16:05.425034 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee55d98070075a73b5b2a7bf8febedf70e680b632a5ec3047cddf3560cfd2221" Mar 18 18:16:05 crc kubenswrapper[5008]: I0318 18:16:05.425080 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564296-8jv4l" Mar 18 18:16:06 crc kubenswrapper[5008]: I0318 18:16:06.083521 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564290-8znfx"] Mar 18 18:16:06 crc kubenswrapper[5008]: I0318 18:16:06.091785 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564290-8znfx"] Mar 18 18:16:06 crc kubenswrapper[5008]: I0318 18:16:06.220092 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="469ffaa4-8b51-436c-bdf8-d43dcbff1fcd" path="/var/lib/kubelet/pods/469ffaa4-8b51-436c-bdf8-d43dcbff1fcd/volumes" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.142042 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-92gtg"] Mar 18 18:16:08 crc kubenswrapper[5008]: E0318 18:16:08.142280 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c5dcfb-6da1-4284-9d73-08a9e08a3e85" containerName="oc" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.142293 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c5dcfb-6da1-4284-9d73-08a9e08a3e85" containerName="oc" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.142398 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c5dcfb-6da1-4284-9d73-08a9e08a3e85" containerName="oc" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.143053 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.163673 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-92gtg"] Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.165756 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-utilities\") pod \"redhat-operators-92gtg\" (UID: \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\") " pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.165951 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-catalog-content\") pod \"redhat-operators-92gtg\" (UID: \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\") " pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.166090 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r46q9\" (UniqueName: \"kubernetes.io/projected/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-kube-api-access-r46q9\") pod \"redhat-operators-92gtg\" (UID: \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\") " pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.266994 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-catalog-content\") pod \"redhat-operators-92gtg\" (UID: \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\") " pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.267086 5008 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-r46q9\" (UniqueName: \"kubernetes.io/projected/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-kube-api-access-r46q9\") pod \"redhat-operators-92gtg\" (UID: \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\") " pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.267135 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-utilities\") pod \"redhat-operators-92gtg\" (UID: \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\") " pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.267505 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-catalog-content\") pod \"redhat-operators-92gtg\" (UID: \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\") " pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.268434 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-utilities\") pod \"redhat-operators-92gtg\" (UID: \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\") " pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.301984 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r46q9\" (UniqueName: \"kubernetes.io/projected/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-kube-api-access-r46q9\") pod \"redhat-operators-92gtg\" (UID: \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\") " pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.468134 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:08 crc kubenswrapper[5008]: I0318 18:16:08.885573 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-92gtg"] Mar 18 18:16:08 crc kubenswrapper[5008]: W0318 18:16:08.896851 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac3c361_c4f3_43dc_9c4a_dd1b56ba0967.slice/crio-954df93be977bdff01c4d76b2d087ce172984c26c65f53855061e826b84c49a5 WatchSource:0}: Error finding container 954df93be977bdff01c4d76b2d087ce172984c26c65f53855061e826b84c49a5: Status 404 returned error can't find the container with id 954df93be977bdff01c4d76b2d087ce172984c26c65f53855061e826b84c49a5 Mar 18 18:16:09 crc kubenswrapper[5008]: I0318 18:16:09.450095 5008 generic.go:334] "Generic (PLEG): container finished" podID="cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" containerID="0ffdaed5ad2591f37a4e9a3c349a45e28d1415e5fd9292ad498ef45c515c80cc" exitCode=0 Mar 18 18:16:09 crc kubenswrapper[5008]: I0318 18:16:09.450181 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92gtg" event={"ID":"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967","Type":"ContainerDied","Data":"0ffdaed5ad2591f37a4e9a3c349a45e28d1415e5fd9292ad498ef45c515c80cc"} Mar 18 18:16:09 crc kubenswrapper[5008]: I0318 18:16:09.450338 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92gtg" event={"ID":"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967","Type":"ContainerStarted","Data":"954df93be977bdff01c4d76b2d087ce172984c26c65f53855061e826b84c49a5"} Mar 18 18:16:11 crc kubenswrapper[5008]: I0318 18:16:11.465678 5008 generic.go:334] "Generic (PLEG): container finished" podID="cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" containerID="3eb87bf886d95d6c6b8ac4fb57eb9a81bfa0522e5af18fc8dc41f02a1888ba1f" exitCode=0 Mar 18 18:16:11 crc kubenswrapper[5008]: I0318 18:16:11.465728 
5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92gtg" event={"ID":"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967","Type":"ContainerDied","Data":"3eb87bf886d95d6c6b8ac4fb57eb9a81bfa0522e5af18fc8dc41f02a1888ba1f"} Mar 18 18:16:11 crc kubenswrapper[5008]: I0318 18:16:11.732032 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sl5k5"] Mar 18 18:16:11 crc kubenswrapper[5008]: I0318 18:16:11.733898 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:11 crc kubenswrapper[5008]: I0318 18:16:11.781600 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sl5k5"] Mar 18 18:16:11 crc kubenswrapper[5008]: I0318 18:16:11.907847 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbq9r\" (UniqueName: \"kubernetes.io/projected/4d945b17-5c72-42b2-84ce-137447fdb572-kube-api-access-wbq9r\") pod \"community-operators-sl5k5\" (UID: \"4d945b17-5c72-42b2-84ce-137447fdb572\") " pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:11 crc kubenswrapper[5008]: I0318 18:16:11.907898 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d945b17-5c72-42b2-84ce-137447fdb572-utilities\") pod \"community-operators-sl5k5\" (UID: \"4d945b17-5c72-42b2-84ce-137447fdb572\") " pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:11 crc kubenswrapper[5008]: I0318 18:16:11.907957 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d945b17-5c72-42b2-84ce-137447fdb572-catalog-content\") pod \"community-operators-sl5k5\" (UID: \"4d945b17-5c72-42b2-84ce-137447fdb572\") " 
pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:12 crc kubenswrapper[5008]: I0318 18:16:12.009990 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbq9r\" (UniqueName: \"kubernetes.io/projected/4d945b17-5c72-42b2-84ce-137447fdb572-kube-api-access-wbq9r\") pod \"community-operators-sl5k5\" (UID: \"4d945b17-5c72-42b2-84ce-137447fdb572\") " pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:12 crc kubenswrapper[5008]: I0318 18:16:12.010050 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d945b17-5c72-42b2-84ce-137447fdb572-utilities\") pod \"community-operators-sl5k5\" (UID: \"4d945b17-5c72-42b2-84ce-137447fdb572\") " pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:12 crc kubenswrapper[5008]: I0318 18:16:12.010081 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d945b17-5c72-42b2-84ce-137447fdb572-catalog-content\") pod \"community-operators-sl5k5\" (UID: \"4d945b17-5c72-42b2-84ce-137447fdb572\") " pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:12 crc kubenswrapper[5008]: I0318 18:16:12.011047 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d945b17-5c72-42b2-84ce-137447fdb572-catalog-content\") pod \"community-operators-sl5k5\" (UID: \"4d945b17-5c72-42b2-84ce-137447fdb572\") " pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:12 crc kubenswrapper[5008]: I0318 18:16:12.011113 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d945b17-5c72-42b2-84ce-137447fdb572-utilities\") pod \"community-operators-sl5k5\" (UID: \"4d945b17-5c72-42b2-84ce-137447fdb572\") " 
pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:12 crc kubenswrapper[5008]: I0318 18:16:12.034753 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbq9r\" (UniqueName: \"kubernetes.io/projected/4d945b17-5c72-42b2-84ce-137447fdb572-kube-api-access-wbq9r\") pod \"community-operators-sl5k5\" (UID: \"4d945b17-5c72-42b2-84ce-137447fdb572\") " pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:12 crc kubenswrapper[5008]: I0318 18:16:12.094818 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:12 crc kubenswrapper[5008]: I0318 18:16:12.354040 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sl5k5"] Mar 18 18:16:12 crc kubenswrapper[5008]: I0318 18:16:12.472501 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sl5k5" event={"ID":"4d945b17-5c72-42b2-84ce-137447fdb572","Type":"ContainerStarted","Data":"ba148802b49cab30ce134ec246705e73372637c4063e48e9230cda3d12cce2eb"} Mar 18 18:16:12 crc kubenswrapper[5008]: I0318 18:16:12.475074 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92gtg" event={"ID":"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967","Type":"ContainerStarted","Data":"0fac5e9450748f28a7eb36f39fb0abd5b7529e4e9a400bea0c202c9aac19ab0e"} Mar 18 18:16:13 crc kubenswrapper[5008]: I0318 18:16:13.483114 5008 generic.go:334] "Generic (PLEG): container finished" podID="4d945b17-5c72-42b2-84ce-137447fdb572" containerID="b4fe266b16a91760496d8c6dda8a107d25c2035d4a7733e29501244a569cedbc" exitCode=0 Mar 18 18:16:13 crc kubenswrapper[5008]: I0318 18:16:13.483211 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sl5k5" 
event={"ID":"4d945b17-5c72-42b2-84ce-137447fdb572","Type":"ContainerDied","Data":"b4fe266b16a91760496d8c6dda8a107d25c2035d4a7733e29501244a569cedbc"} Mar 18 18:16:13 crc kubenswrapper[5008]: I0318 18:16:13.515585 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-92gtg" podStartSLOduration=3.025739222 podStartE2EDuration="5.515517551s" podCreationTimestamp="2026-03-18 18:16:08 +0000 UTC" firstStartedPulling="2026-03-18 18:16:09.451453398 +0000 UTC m=+825.970926477" lastFinishedPulling="2026-03-18 18:16:11.941231717 +0000 UTC m=+828.460704806" observedRunningTime="2026-03-18 18:16:12.491142933 +0000 UTC m=+829.010616032" watchObservedRunningTime="2026-03-18 18:16:13.515517551 +0000 UTC m=+830.034990660" Mar 18 18:16:14 crc kubenswrapper[5008]: I0318 18:16:14.495405 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sl5k5" event={"ID":"4d945b17-5c72-42b2-84ce-137447fdb572","Type":"ContainerStarted","Data":"8e803ec6f5825a3617a21fd291839244431efd39eb5fb09771a5b025fc3739f8"} Mar 18 18:16:14 crc kubenswrapper[5008]: E0318 18:16:14.626858 5008 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d945b17_5c72_42b2_84ce_137447fdb572.slice/crio-conmon-8e803ec6f5825a3617a21fd291839244431efd39eb5fb09771a5b025fc3739f8.scope\": RecentStats: unable to find data in memory cache]" Mar 18 18:16:15 crc kubenswrapper[5008]: I0318 18:16:15.505622 5008 generic.go:334] "Generic (PLEG): container finished" podID="4d945b17-5c72-42b2-84ce-137447fdb572" containerID="8e803ec6f5825a3617a21fd291839244431efd39eb5fb09771a5b025fc3739f8" exitCode=0 Mar 18 18:16:15 crc kubenswrapper[5008]: I0318 18:16:15.505689 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sl5k5" 
event={"ID":"4d945b17-5c72-42b2-84ce-137447fdb572","Type":"ContainerDied","Data":"8e803ec6f5825a3617a21fd291839244431efd39eb5fb09771a5b025fc3739f8"} Mar 18 18:16:16 crc kubenswrapper[5008]: I0318 18:16:16.511801 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sl5k5" event={"ID":"4d945b17-5c72-42b2-84ce-137447fdb572","Type":"ContainerStarted","Data":"eeba34f06012ba3033bc8be6c8cf36196f207a7fea8e3b2865adbb6c4d3f02eb"} Mar 18 18:16:16 crc kubenswrapper[5008]: I0318 18:16:16.532524 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sl5k5" podStartSLOduration=2.861311954 podStartE2EDuration="5.532500688s" podCreationTimestamp="2026-03-18 18:16:11 +0000 UTC" firstStartedPulling="2026-03-18 18:16:13.486525539 +0000 UTC m=+830.005998658" lastFinishedPulling="2026-03-18 18:16:16.157714293 +0000 UTC m=+832.677187392" observedRunningTime="2026-03-18 18:16:16.529410102 +0000 UTC m=+833.048883191" watchObservedRunningTime="2026-03-18 18:16:16.532500688 +0000 UTC m=+833.051973777" Mar 18 18:16:18 crc kubenswrapper[5008]: I0318 18:16:18.469037 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:18 crc kubenswrapper[5008]: I0318 18:16:18.469516 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:19 crc kubenswrapper[5008]: I0318 18:16:19.522266 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-92gtg" podUID="cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" containerName="registry-server" probeResult="failure" output=< Mar 18 18:16:19 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s Mar 18 18:16:19 crc kubenswrapper[5008]: > Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.515461 5008 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5278w"] Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.516985 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="nbdb" containerID="cri-o://466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a" gracePeriod=30 Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.517111 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovn-acl-logging" containerID="cri-o://9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f" gracePeriod=30 Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.516976 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovn-controller" containerID="cri-o://e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f" gracePeriod=30 Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.517107 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="kube-rbac-proxy-node" containerID="cri-o://ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a" gracePeriod=30 Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.517185 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="sbdb" containerID="cri-o://b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff" gracePeriod=30 Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.517121 5008 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="northd" containerID="cri-o://65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b" gracePeriod=30 Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.518002 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd" gracePeriod=30 Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.591819 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" containerID="cri-o://568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78" gracePeriod=30 Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.862518 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/3.log" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.864901 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovn-acl-logging/0.log" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.865428 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovn-controller/0.log" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.866050 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.922812 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4twbz"] Mar 18 18:16:20 crc kubenswrapper[5008]: E0318 18:16:20.923024 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="sbdb" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923037 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="sbdb" Mar 18 18:16:20 crc kubenswrapper[5008]: E0318 18:16:20.923043 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923049 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: E0318 18:16:20.923059 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovn-acl-logging" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923065 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovn-acl-logging" Mar 18 18:16:20 crc kubenswrapper[5008]: E0318 18:16:20.923074 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923079 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: E0318 18:16:20.923086 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" 
containerName="nbdb" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923091 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="nbdb" Mar 18 18:16:20 crc kubenswrapper[5008]: E0318 18:16:20.923100 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovn-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923106 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovn-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: E0318 18:16:20.923116 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="northd" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923121 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="northd" Mar 18 18:16:20 crc kubenswrapper[5008]: E0318 18:16:20.923129 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923135 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 18:16:20 crc kubenswrapper[5008]: E0318 18:16:20.923144 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="kube-rbac-proxy-node" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923150 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="kube-rbac-proxy-node" Mar 18 18:16:20 crc kubenswrapper[5008]: E0318 18:16:20.923157 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" 
containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923163 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: E0318 18:16:20.923171 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923177 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: E0318 18:16:20.923184 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="kubecfg-setup" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923191 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="kubecfg-setup" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923284 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="sbdb" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923295 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923303 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923311 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovn-acl-logging" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923319 5008 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923326 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923333 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="northd" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923340 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="nbdb" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923346 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923353 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923360 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="kube-rbac-proxy-node" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923368 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovn-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: E0318 18:16:20.923451 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.923458 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerName="ovnkube-controller" Mar 18 18:16:20 crc kubenswrapper[5008]: I0318 18:16:20.925021 5008 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034685 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovnkube-script-lib\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034727 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-cni-netd\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034772 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-node-log\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034784 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-openvswitch\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034803 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-systemd\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034819 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-env-overrides\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034829 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034835 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-node-log" (OuterVolumeSpecName: "node-log") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034845 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-cni-bin\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034860 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-run-netns\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034874 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovn-node-metrics-cert\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034894 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-etc-openvswitch\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034911 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-slash\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034924 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-var-lib-openvswitch\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034915 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034942 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29hqn\" (UniqueName: \"kubernetes.io/projected/b105c010-f5cb-41ae-bdff-62bc05da91a1-kube-api-access-29hqn\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034983 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.034973 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035032 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-systemd-units\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035056 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-slash" (OuterVolumeSpecName: "host-slash") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035059 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035073 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovnkube-config\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035100 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). 
InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035118 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035169 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035184 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-ovn\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035223 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035262 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-ovn" (OuterVolumeSpecName: "run-ovn") pod 
"b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035267 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-log-socket\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035297 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-log-socket" (OuterVolumeSpecName: "log-socket") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035333 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-run-ovn-kubernetes\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035337 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035366 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035381 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-kubelet\") pod \"b105c010-f5cb-41ae-bdff-62bc05da91a1\" (UID: \"b105c010-f5cb-41ae-bdff-62bc05da91a1\") " Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035407 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035716 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035731 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035749 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-log-socket\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035796 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-run-openvswitch\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035832 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-kubelet\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035866 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035897 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0128d806-9156-4017-ac52-1f5f009d6999-env-overrides\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035929 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-run-ovn\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.035974 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-slash\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036014 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-cni-bin\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036101 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-n84tf\" (UniqueName: \"kubernetes.io/projected/0128d806-9156-4017-ac52-1f5f009d6999-kube-api-access-n84tf\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036147 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-etc-openvswitch\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036235 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-run-netns\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036286 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-node-log\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036560 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-systemd-units\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036615 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-var-lib-openvswitch\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036647 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-run-ovn-kubernetes\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036674 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-cni-netd\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036702 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0128d806-9156-4017-ac52-1f5f009d6999-ovnkube-script-lib\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036783 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0128d806-9156-4017-ac52-1f5f009d6999-ovn-node-metrics-cert\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036838 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-run-systemd\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036870 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0128d806-9156-4017-ac52-1f5f009d6999-ovnkube-config\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.036969 5008 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037000 5008 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037025 5008 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037042 5008 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-log-socket\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037060 5008 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037076 5008 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037095 5008 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037113 5008 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037129 5008 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-node-log\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037146 5008 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037165 5008 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037180 5008 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 18 
18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037196 5008 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037213 5008 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037231 5008 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-host-slash\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037248 5008 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.037264 5008 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.040826 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.041085 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b105c010-f5cb-41ae-bdff-62bc05da91a1-kube-api-access-29hqn" (OuterVolumeSpecName: "kube-api-access-29hqn") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "kube-api-access-29hqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.058960 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b105c010-f5cb-41ae-bdff-62bc05da91a1" (UID: "b105c010-f5cb-41ae-bdff-62bc05da91a1"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138244 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-log-socket\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138310 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-run-openvswitch\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138330 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-kubelet\") pod \"ovnkube-node-4twbz\" (UID: 
\"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138345 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138360 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0128d806-9156-4017-ac52-1f5f009d6999-env-overrides\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138377 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-run-ovn\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138394 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-slash\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138413 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-cni-bin\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138432 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n84tf\" (UniqueName: \"kubernetes.io/projected/0128d806-9156-4017-ac52-1f5f009d6999-kube-api-access-n84tf\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138446 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-etc-openvswitch\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138447 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138469 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-run-netns\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138455 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-run-openvswitch\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138486 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-kubelet\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138517 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-node-log\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138528 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-slash\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138562 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-systemd-units\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138551 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-cni-bin\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138494 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-run-netns\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138601 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-run-ovn\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138604 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-etc-openvswitch\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138617 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-node-log\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138635 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-systemd-units\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138654 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-var-lib-openvswitch\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138670 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-run-ovn-kubernetes\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138685 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-cni-netd\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138698 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0128d806-9156-4017-ac52-1f5f009d6999-ovnkube-script-lib\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138705 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-run-ovn-kubernetes\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138715 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/0128d806-9156-4017-ac52-1f5f009d6999-ovn-node-metrics-cert\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138731 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-run-systemd\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138736 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-host-cni-netd\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138744 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0128d806-9156-4017-ac52-1f5f009d6999-ovnkube-config\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138851 5008 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b105c010-f5cb-41ae-bdff-62bc05da91a1-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138867 5008 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b105c010-f5cb-41ae-bdff-62bc05da91a1-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138880 5008 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-29hqn\" (UniqueName: \"kubernetes.io/projected/b105c010-f5cb-41ae-bdff-62bc05da91a1-kube-api-access-29hqn\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.138681 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-var-lib-openvswitch\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.139431 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-run-systemd\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.139432 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0128d806-9156-4017-ac52-1f5f009d6999-env-overrides\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.139451 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0128d806-9156-4017-ac52-1f5f009d6999-ovnkube-config\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.139492 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0128d806-9156-4017-ac52-1f5f009d6999-log-socket\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.140159 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0128d806-9156-4017-ac52-1f5f009d6999-ovnkube-script-lib\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.152265 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0128d806-9156-4017-ac52-1f5f009d6999-ovn-node-metrics-cert\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.158691 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n84tf\" (UniqueName: \"kubernetes.io/projected/0128d806-9156-4017-ac52-1f5f009d6999-kube-api-access-n84tf\") pod \"ovnkube-node-4twbz\" (UID: \"0128d806-9156-4017-ac52-1f5f009d6999\") " pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.245431 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.559293 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovnkube-controller/3.log" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.562187 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovn-acl-logging/0.log" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.562856 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5278w_b105c010-f5cb-41ae-bdff-62bc05da91a1/ovn-controller/0.log" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563211 5008 generic.go:334] "Generic (PLEG): container finished" podID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerID="568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78" exitCode=0 Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563248 5008 generic.go:334] "Generic (PLEG): container finished" podID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerID="b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff" exitCode=0 Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563262 5008 generic.go:334] "Generic (PLEG): container finished" podID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerID="466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a" exitCode=0 Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563275 5008 generic.go:334] "Generic (PLEG): container finished" podID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerID="65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b" exitCode=0 Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563285 5008 generic.go:334] "Generic (PLEG): container finished" podID="b105c010-f5cb-41ae-bdff-62bc05da91a1" 
containerID="ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd" exitCode=0 Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563296 5008 generic.go:334] "Generic (PLEG): container finished" podID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerID="ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a" exitCode=0 Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563308 5008 generic.go:334] "Generic (PLEG): container finished" podID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerID="9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f" exitCode=143 Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563320 5008 generic.go:334] "Generic (PLEG): container finished" podID="b105c010-f5cb-41ae-bdff-62bc05da91a1" containerID="e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f" exitCode=143 Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563338 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563336 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563449 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563470 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" 
event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563473 5008 scope.go:117] "RemoveContainer" containerID="568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563488 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563505 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563523 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563540 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563574 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563586 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563595 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563603 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563612 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563621 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563630 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563639 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563653 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563669 5008 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563680 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563691 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563702 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563712 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563721 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563731 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563740 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563748 5008 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563757 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563769 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563783 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563793 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563802 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563811 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563820 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b"} Mar 18 
18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563828 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563837 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563846 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563854 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563863 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563875 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5278w" event={"ID":"b105c010-f5cb-41ae-bdff-62bc05da91a1","Type":"ContainerDied","Data":"0444e493c681e3b62a7e6d7372ec2335c13d32de42341be333ec7f12ccc56662"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563889 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563900 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563908 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563917 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563925 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563933 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563942 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563950 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563959 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.563969 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.567121 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgv8s_9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd/kube-multus/2.log" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.567707 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgv8s_9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd/kube-multus/1.log" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.567752 5008 generic.go:334] "Generic (PLEG): container finished" podID="9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd" containerID="9ed6a37c676aa71b3c726690ca391ca75c10ba6ed602041cdcebcfdc7e15ea9e" exitCode=2 Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.567824 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgv8s" event={"ID":"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd","Type":"ContainerDied","Data":"9ed6a37c676aa71b3c726690ca391ca75c10ba6ed602041cdcebcfdc7e15ea9e"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.567905 5008 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"49e87bbd2ba41b38445ad4d5a4cac446075b67c32199067b8e9316283fdc1d0b"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.568505 5008 scope.go:117] "RemoveContainer" containerID="9ed6a37c676aa71b3c726690ca391ca75c10ba6ed602041cdcebcfdc7e15ea9e" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.579512 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" event={"ID":"0128d806-9156-4017-ac52-1f5f009d6999","Type":"ContainerStarted","Data":"4280aa802066a7cb769ee78356cd1be6476b065a18430ed3de2817b85b2d1f1d"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.579583 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" event={"ID":"0128d806-9156-4017-ac52-1f5f009d6999","Type":"ContainerStarted","Data":"66f922dcde3519482ca2cb56fed57333fe0509c18aaef53e9eb9d36add296611"} Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.588134 5008 scope.go:117] "RemoveContainer" containerID="e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.616994 5008 scope.go:117] "RemoveContainer" containerID="b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.645245 5008 scope.go:117] "RemoveContainer" containerID="466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.652073 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5278w"] Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.658309 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5278w"] Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.667002 5008 scope.go:117] "RemoveContainer" containerID="65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.721165 5008 scope.go:117] "RemoveContainer" containerID="ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.738664 5008 scope.go:117] "RemoveContainer" containerID="ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.760000 5008 scope.go:117] "RemoveContainer" containerID="9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.790196 5008 scope.go:117] "RemoveContainer" containerID="e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f" Mar 18 18:16:21 crc 
kubenswrapper[5008]: I0318 18:16:21.811419 5008 scope.go:117] "RemoveContainer" containerID="69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.839208 5008 scope.go:117] "RemoveContainer" containerID="568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78" Mar 18 18:16:21 crc kubenswrapper[5008]: E0318 18:16:21.841264 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78\": container with ID starting with 568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78 not found: ID does not exist" containerID="568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.841493 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78"} err="failed to get container status \"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78\": rpc error: code = NotFound desc = could not find container \"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78\": container with ID starting with 568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78 not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.841651 5008 scope.go:117] "RemoveContainer" containerID="e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9" Mar 18 18:16:21 crc kubenswrapper[5008]: E0318 18:16:21.842777 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\": container with ID starting with e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9 not found: ID does not exist" 
containerID="e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.842866 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9"} err="failed to get container status \"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\": rpc error: code = NotFound desc = could not find container \"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\": container with ID starting with e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9 not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.842911 5008 scope.go:117] "RemoveContainer" containerID="b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff" Mar 18 18:16:21 crc kubenswrapper[5008]: E0318 18:16:21.844035 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\": container with ID starting with b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff not found: ID does not exist" containerID="b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.844182 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff"} err="failed to get container status \"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\": rpc error: code = NotFound desc = could not find container \"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\": container with ID starting with b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.844333 5008 scope.go:117] 
"RemoveContainer" containerID="466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a" Mar 18 18:16:21 crc kubenswrapper[5008]: E0318 18:16:21.846855 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\": container with ID starting with 466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a not found: ID does not exist" containerID="466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.846911 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a"} err="failed to get container status \"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\": rpc error: code = NotFound desc = could not find container \"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\": container with ID starting with 466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.846932 5008 scope.go:117] "RemoveContainer" containerID="65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b" Mar 18 18:16:21 crc kubenswrapper[5008]: E0318 18:16:21.847330 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\": container with ID starting with 65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b not found: ID does not exist" containerID="65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.847368 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b"} err="failed to get container status \"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\": rpc error: code = NotFound desc = could not find container \"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\": container with ID starting with 65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.847394 5008 scope.go:117] "RemoveContainer" containerID="ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd" Mar 18 18:16:21 crc kubenswrapper[5008]: E0318 18:16:21.847874 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\": container with ID starting with ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd not found: ID does not exist" containerID="ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.847947 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd"} err="failed to get container status \"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\": rpc error: code = NotFound desc = could not find container \"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\": container with ID starting with ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.847989 5008 scope.go:117] "RemoveContainer" containerID="ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a" Mar 18 18:16:21 crc kubenswrapper[5008]: E0318 18:16:21.848319 5008 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\": container with ID starting with ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a not found: ID does not exist" containerID="ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.848439 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a"} err="failed to get container status \"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\": rpc error: code = NotFound desc = could not find container \"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\": container with ID starting with ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.848464 5008 scope.go:117] "RemoveContainer" containerID="9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f" Mar 18 18:16:21 crc kubenswrapper[5008]: E0318 18:16:21.848792 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\": container with ID starting with 9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f not found: ID does not exist" containerID="9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.848843 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f"} err="failed to get container status \"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\": rpc error: code = NotFound desc = could not find container 
\"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\": container with ID starting with 9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.848861 5008 scope.go:117] "RemoveContainer" containerID="e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f" Mar 18 18:16:21 crc kubenswrapper[5008]: E0318 18:16:21.849266 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\": container with ID starting with e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f not found: ID does not exist" containerID="e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.849291 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f"} err="failed to get container status \"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\": rpc error: code = NotFound desc = could not find container \"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\": container with ID starting with e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.849307 5008 scope.go:117] "RemoveContainer" containerID="69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a" Mar 18 18:16:21 crc kubenswrapper[5008]: E0318 18:16:21.849609 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\": container with ID starting with 69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a not found: ID does not exist" 
containerID="69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.849633 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a"} err="failed to get container status \"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\": rpc error: code = NotFound desc = could not find container \"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\": container with ID starting with 69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.849671 5008 scope.go:117] "RemoveContainer" containerID="568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.849916 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78"} err="failed to get container status \"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78\": rpc error: code = NotFound desc = could not find container \"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78\": container with ID starting with 568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78 not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.849938 5008 scope.go:117] "RemoveContainer" containerID="e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.850195 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9"} err="failed to get container status \"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\": rpc error: code = NotFound desc = could 
not find container \"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\": container with ID starting with e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9 not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.850216 5008 scope.go:117] "RemoveContainer" containerID="b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.850464 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff"} err="failed to get container status \"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\": rpc error: code = NotFound desc = could not find container \"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\": container with ID starting with b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.850506 5008 scope.go:117] "RemoveContainer" containerID="466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.850753 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a"} err="failed to get container status \"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\": rpc error: code = NotFound desc = could not find container \"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\": container with ID starting with 466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.850776 5008 scope.go:117] "RemoveContainer" containerID="65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 
18:16:21.851012 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b"} err="failed to get container status \"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\": rpc error: code = NotFound desc = could not find container \"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\": container with ID starting with 65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.851034 5008 scope.go:117] "RemoveContainer" containerID="ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.851259 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd"} err="failed to get container status \"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\": rpc error: code = NotFound desc = could not find container \"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\": container with ID starting with ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.851284 5008 scope.go:117] "RemoveContainer" containerID="ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.851508 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a"} err="failed to get container status \"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\": rpc error: code = NotFound desc = could not find container \"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\": container with ID starting with 
ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.851528 5008 scope.go:117] "RemoveContainer" containerID="9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.851771 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f"} err="failed to get container status \"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\": rpc error: code = NotFound desc = could not find container \"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\": container with ID starting with 9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.851794 5008 scope.go:117] "RemoveContainer" containerID="e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.852071 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f"} err="failed to get container status \"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\": rpc error: code = NotFound desc = could not find container \"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\": container with ID starting with e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.852094 5008 scope.go:117] "RemoveContainer" containerID="69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.852334 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a"} err="failed to get container status \"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\": rpc error: code = NotFound desc = could not find container \"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\": container with ID starting with 69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.852357 5008 scope.go:117] "RemoveContainer" containerID="568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.852619 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78"} err="failed to get container status \"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78\": rpc error: code = NotFound desc = could not find container \"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78\": container with ID starting with 568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78 not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.852649 5008 scope.go:117] "RemoveContainer" containerID="e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.852871 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9"} err="failed to get container status \"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\": rpc error: code = NotFound desc = could not find container \"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\": container with ID starting with e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9 not found: ID does not 
exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.852897 5008 scope.go:117] "RemoveContainer" containerID="b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.853235 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff"} err="failed to get container status \"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\": rpc error: code = NotFound desc = could not find container \"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\": container with ID starting with b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.853263 5008 scope.go:117] "RemoveContainer" containerID="466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.853514 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a"} err="failed to get container status \"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\": rpc error: code = NotFound desc = could not find container \"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\": container with ID starting with 466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.853586 5008 scope.go:117] "RemoveContainer" containerID="65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.853840 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b"} err="failed to get container status 
\"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\": rpc error: code = NotFound desc = could not find container \"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\": container with ID starting with 65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.853887 5008 scope.go:117] "RemoveContainer" containerID="ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.854100 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd"} err="failed to get container status \"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\": rpc error: code = NotFound desc = could not find container \"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\": container with ID starting with ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.854148 5008 scope.go:117] "RemoveContainer" containerID="ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.854594 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a"} err="failed to get container status \"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\": rpc error: code = NotFound desc = could not find container \"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\": container with ID starting with ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.854618 5008 scope.go:117] "RemoveContainer" 
containerID="9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.854880 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f"} err="failed to get container status \"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\": rpc error: code = NotFound desc = could not find container \"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\": container with ID starting with 9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.854929 5008 scope.go:117] "RemoveContainer" containerID="e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.855195 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f"} err="failed to get container status \"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\": rpc error: code = NotFound desc = could not find container \"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\": container with ID starting with e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.855274 5008 scope.go:117] "RemoveContainer" containerID="69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.855504 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a"} err="failed to get container status \"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\": rpc error: code = NotFound desc = could 
not find container \"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\": container with ID starting with 69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.855523 5008 scope.go:117] "RemoveContainer" containerID="568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.855997 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78"} err="failed to get container status \"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78\": rpc error: code = NotFound desc = could not find container \"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78\": container with ID starting with 568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78 not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.856048 5008 scope.go:117] "RemoveContainer" containerID="e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.856364 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9"} err="failed to get container status \"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\": rpc error: code = NotFound desc = could not find container \"e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9\": container with ID starting with e720fa88ed4d009918b5f8f39b66449f9ce3c4155411e196d7e01ce96af434a9 not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.856392 5008 scope.go:117] "RemoveContainer" containerID="b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 
18:16:21.856826 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff"} err="failed to get container status \"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\": rpc error: code = NotFound desc = could not find container \"b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff\": container with ID starting with b190618612edbbd5c6918a02199f121f839c9de49eba3924001cdb9a847181ff not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.856889 5008 scope.go:117] "RemoveContainer" containerID="466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.857343 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a"} err="failed to get container status \"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\": rpc error: code = NotFound desc = could not find container \"466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a\": container with ID starting with 466d45bd557b03b401a009438a477a18d8c5b144218fb025ead61b11c0e7d39a not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.857363 5008 scope.go:117] "RemoveContainer" containerID="65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.857788 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b"} err="failed to get container status \"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\": rpc error: code = NotFound desc = could not find container \"65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b\": container with ID starting with 
65eabedfbd5bf8f93d0ce03fe9e5f090b865fae76ea53a3fd573d60aa6e96a9b not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.857823 5008 scope.go:117] "RemoveContainer" containerID="ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.858320 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd"} err="failed to get container status \"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\": rpc error: code = NotFound desc = could not find container \"ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd\": container with ID starting with ea0dd591fddccac92ebfb0c115f33bd3caeddbc2e462efa925ad0f7f98cb71fd not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.858342 5008 scope.go:117] "RemoveContainer" containerID="ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.858642 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a"} err="failed to get container status \"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\": rpc error: code = NotFound desc = could not find container \"ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a\": container with ID starting with ced142099deead2dd87882c3a21658f9e02658572d456db1b05d29de17a1b61a not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.858662 5008 scope.go:117] "RemoveContainer" containerID="9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.858972 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f"} err="failed to get container status \"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\": rpc error: code = NotFound desc = could not find container \"9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f\": container with ID starting with 9f7a2ac14747ea769179c0009d8d39a5b7e29e75909ec4915a42bf8ce453185f not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.859025 5008 scope.go:117] "RemoveContainer" containerID="e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.859394 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f"} err="failed to get container status \"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\": rpc error: code = NotFound desc = could not find container \"e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f\": container with ID starting with e246e505741356d9b49d3ca2ef2e4688827af0b55c90656bac77b96f13c1e13f not found: ID does not exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.859415 5008 scope.go:117] "RemoveContainer" containerID="69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.859701 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a"} err="failed to get container status \"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\": rpc error: code = NotFound desc = could not find container \"69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a\": container with ID starting with 69cc15a7224e2a4b12d450beaadaff100369a2404059b34b042cdd849f13120a not found: ID does not 
exist" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.859716 5008 scope.go:117] "RemoveContainer" containerID="568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78" Mar 18 18:16:21 crc kubenswrapper[5008]: I0318 18:16:21.860070 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78"} err="failed to get container status \"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78\": rpc error: code = NotFound desc = could not find container \"568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78\": container with ID starting with 568190db9536aa63580bcaa0690d18680406b18fe35064917b2a4939b5496a78 not found: ID does not exist" Mar 18 18:16:22 crc kubenswrapper[5008]: I0318 18:16:22.095701 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:22 crc kubenswrapper[5008]: I0318 18:16:22.095742 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:22 crc kubenswrapper[5008]: I0318 18:16:22.153604 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:22 crc kubenswrapper[5008]: I0318 18:16:22.209313 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b105c010-f5cb-41ae-bdff-62bc05da91a1" path="/var/lib/kubelet/pods/b105c010-f5cb-41ae-bdff-62bc05da91a1/volumes" Mar 18 18:16:22 crc kubenswrapper[5008]: I0318 18:16:22.591787 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgv8s_9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd/kube-multus/2.log" Mar 18 18:16:22 crc kubenswrapper[5008]: I0318 18:16:22.592911 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-sgv8s_9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd/kube-multus/1.log" Mar 18 18:16:22 crc kubenswrapper[5008]: I0318 18:16:22.593080 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgv8s" event={"ID":"9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd","Type":"ContainerStarted","Data":"34d324e2886658dc43ed62469555fc1739eb671f888174a95ad2f54bc5c24764"} Mar 18 18:16:22 crc kubenswrapper[5008]: I0318 18:16:22.594613 5008 generic.go:334] "Generic (PLEG): container finished" podID="0128d806-9156-4017-ac52-1f5f009d6999" containerID="4280aa802066a7cb769ee78356cd1be6476b065a18430ed3de2817b85b2d1f1d" exitCode=0 Mar 18 18:16:22 crc kubenswrapper[5008]: I0318 18:16:22.594958 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" event={"ID":"0128d806-9156-4017-ac52-1f5f009d6999","Type":"ContainerDied","Data":"4280aa802066a7cb769ee78356cd1be6476b065a18430ed3de2817b85b2d1f1d"} Mar 18 18:16:22 crc kubenswrapper[5008]: I0318 18:16:22.646224 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:22 crc kubenswrapper[5008]: I0318 18:16:22.725014 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sl5k5"] Mar 18 18:16:23 crc kubenswrapper[5008]: I0318 18:16:23.603597 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" event={"ID":"0128d806-9156-4017-ac52-1f5f009d6999","Type":"ContainerStarted","Data":"005f9652d84a08eaa9c48dbde7aff7289d6f99544f3bfdeceeb59de72b126e5f"} Mar 18 18:16:23 crc kubenswrapper[5008]: I0318 18:16:23.604056 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" event={"ID":"0128d806-9156-4017-ac52-1f5f009d6999","Type":"ContainerStarted","Data":"f844e3ae5b0962987b257e50ce63bdb46f6c073886c3b48723330da743b9b343"} Mar 18 
18:16:23 crc kubenswrapper[5008]: I0318 18:16:23.604075 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" event={"ID":"0128d806-9156-4017-ac52-1f5f009d6999","Type":"ContainerStarted","Data":"9d00d1fc5521055d833be3dde7664612205728b0c1fef32e92852e9ab6a9b42c"} Mar 18 18:16:23 crc kubenswrapper[5008]: I0318 18:16:23.604089 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" event={"ID":"0128d806-9156-4017-ac52-1f5f009d6999","Type":"ContainerStarted","Data":"55550502d346427b39fe19aa8fed2f4c2b679f1e6c89226b3ccc2dcdac288666"} Mar 18 18:16:23 crc kubenswrapper[5008]: I0318 18:16:23.604103 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" event={"ID":"0128d806-9156-4017-ac52-1f5f009d6999","Type":"ContainerStarted","Data":"17863f5e3072577c827c4e22e767c4b5aa811c1e9fbea3f63e1a9e922c6560cd"} Mar 18 18:16:24 crc kubenswrapper[5008]: I0318 18:16:24.611908 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" event={"ID":"0128d806-9156-4017-ac52-1f5f009d6999","Type":"ContainerStarted","Data":"e34fc390e1f393e56caa56a2914417dbec1ffe3068c21f24cab852c5d7c2a036"} Mar 18 18:16:24 crc kubenswrapper[5008]: I0318 18:16:24.612034 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sl5k5" podUID="4d945b17-5c72-42b2-84ce-137447fdb572" containerName="registry-server" containerID="cri-o://eeba34f06012ba3033bc8be6c8cf36196f207a7fea8e3b2865adbb6c4d3f02eb" gracePeriod=2 Mar 18 18:16:24 crc kubenswrapper[5008]: I0318 18:16:24.822031 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:24.991240 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d945b17-5c72-42b2-84ce-137447fdb572-utilities\") pod \"4d945b17-5c72-42b2-84ce-137447fdb572\" (UID: \"4d945b17-5c72-42b2-84ce-137447fdb572\") " Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:24.991300 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d945b17-5c72-42b2-84ce-137447fdb572-catalog-content\") pod \"4d945b17-5c72-42b2-84ce-137447fdb572\" (UID: \"4d945b17-5c72-42b2-84ce-137447fdb572\") " Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:24.991333 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbq9r\" (UniqueName: \"kubernetes.io/projected/4d945b17-5c72-42b2-84ce-137447fdb572-kube-api-access-wbq9r\") pod \"4d945b17-5c72-42b2-84ce-137447fdb572\" (UID: \"4d945b17-5c72-42b2-84ce-137447fdb572\") " Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:24.992672 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d945b17-5c72-42b2-84ce-137447fdb572-utilities" (OuterVolumeSpecName: "utilities") pod "4d945b17-5c72-42b2-84ce-137447fdb572" (UID: "4d945b17-5c72-42b2-84ce-137447fdb572"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.000096 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d945b17-5c72-42b2-84ce-137447fdb572-kube-api-access-wbq9r" (OuterVolumeSpecName: "kube-api-access-wbq9r") pod "4d945b17-5c72-42b2-84ce-137447fdb572" (UID: "4d945b17-5c72-42b2-84ce-137447fdb572"). InnerVolumeSpecName "kube-api-access-wbq9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.092236 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d945b17-5c72-42b2-84ce-137447fdb572-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.092273 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbq9r\" (UniqueName: \"kubernetes.io/projected/4d945b17-5c72-42b2-84ce-137447fdb572-kube-api-access-wbq9r\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.490321 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d945b17-5c72-42b2-84ce-137447fdb572-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d945b17-5c72-42b2-84ce-137447fdb572" (UID: "4d945b17-5c72-42b2-84ce-137447fdb572"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.496937 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d945b17-5c72-42b2-84ce-137447fdb572-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.622626 5008 generic.go:334] "Generic (PLEG): container finished" podID="4d945b17-5c72-42b2-84ce-137447fdb572" containerID="eeba34f06012ba3033bc8be6c8cf36196f207a7fea8e3b2865adbb6c4d3f02eb" exitCode=0 Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.622677 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sl5k5" event={"ID":"4d945b17-5c72-42b2-84ce-137447fdb572","Type":"ContainerDied","Data":"eeba34f06012ba3033bc8be6c8cf36196f207a7fea8e3b2865adbb6c4d3f02eb"} Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.622707 5008 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-sl5k5" event={"ID":"4d945b17-5c72-42b2-84ce-137447fdb572","Type":"ContainerDied","Data":"ba148802b49cab30ce134ec246705e73372637c4063e48e9230cda3d12cce2eb"} Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.622731 5008 scope.go:117] "RemoveContainer" containerID="eeba34f06012ba3033bc8be6c8cf36196f207a7fea8e3b2865adbb6c4d3f02eb" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.622745 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sl5k5" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.653959 5008 scope.go:117] "RemoveContainer" containerID="8e803ec6f5825a3617a21fd291839244431efd39eb5fb09771a5b025fc3739f8" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.679376 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sl5k5"] Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.681209 5008 scope.go:117] "RemoveContainer" containerID="b4fe266b16a91760496d8c6dda8a107d25c2035d4a7733e29501244a569cedbc" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.693692 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sl5k5"] Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.710814 5008 scope.go:117] "RemoveContainer" containerID="eeba34f06012ba3033bc8be6c8cf36196f207a7fea8e3b2865adbb6c4d3f02eb" Mar 18 18:16:25 crc kubenswrapper[5008]: E0318 18:16:25.711455 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeba34f06012ba3033bc8be6c8cf36196f207a7fea8e3b2865adbb6c4d3f02eb\": container with ID starting with eeba34f06012ba3033bc8be6c8cf36196f207a7fea8e3b2865adbb6c4d3f02eb not found: ID does not exist" containerID="eeba34f06012ba3033bc8be6c8cf36196f207a7fea8e3b2865adbb6c4d3f02eb" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 
18:16:25.711507 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeba34f06012ba3033bc8be6c8cf36196f207a7fea8e3b2865adbb6c4d3f02eb"} err="failed to get container status \"eeba34f06012ba3033bc8be6c8cf36196f207a7fea8e3b2865adbb6c4d3f02eb\": rpc error: code = NotFound desc = could not find container \"eeba34f06012ba3033bc8be6c8cf36196f207a7fea8e3b2865adbb6c4d3f02eb\": container with ID starting with eeba34f06012ba3033bc8be6c8cf36196f207a7fea8e3b2865adbb6c4d3f02eb not found: ID does not exist" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.711537 5008 scope.go:117] "RemoveContainer" containerID="8e803ec6f5825a3617a21fd291839244431efd39eb5fb09771a5b025fc3739f8" Mar 18 18:16:25 crc kubenswrapper[5008]: E0318 18:16:25.712079 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e803ec6f5825a3617a21fd291839244431efd39eb5fb09771a5b025fc3739f8\": container with ID starting with 8e803ec6f5825a3617a21fd291839244431efd39eb5fb09771a5b025fc3739f8 not found: ID does not exist" containerID="8e803ec6f5825a3617a21fd291839244431efd39eb5fb09771a5b025fc3739f8" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.712126 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e803ec6f5825a3617a21fd291839244431efd39eb5fb09771a5b025fc3739f8"} err="failed to get container status \"8e803ec6f5825a3617a21fd291839244431efd39eb5fb09771a5b025fc3739f8\": rpc error: code = NotFound desc = could not find container \"8e803ec6f5825a3617a21fd291839244431efd39eb5fb09771a5b025fc3739f8\": container with ID starting with 8e803ec6f5825a3617a21fd291839244431efd39eb5fb09771a5b025fc3739f8 not found: ID does not exist" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.712163 5008 scope.go:117] "RemoveContainer" containerID="b4fe266b16a91760496d8c6dda8a107d25c2035d4a7733e29501244a569cedbc" Mar 18 18:16:25 crc 
kubenswrapper[5008]: E0318 18:16:25.712971 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4fe266b16a91760496d8c6dda8a107d25c2035d4a7733e29501244a569cedbc\": container with ID starting with b4fe266b16a91760496d8c6dda8a107d25c2035d4a7733e29501244a569cedbc not found: ID does not exist" containerID="b4fe266b16a91760496d8c6dda8a107d25c2035d4a7733e29501244a569cedbc" Mar 18 18:16:25 crc kubenswrapper[5008]: I0318 18:16:25.713135 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4fe266b16a91760496d8c6dda8a107d25c2035d4a7733e29501244a569cedbc"} err="failed to get container status \"b4fe266b16a91760496d8c6dda8a107d25c2035d4a7733e29501244a569cedbc\": rpc error: code = NotFound desc = could not find container \"b4fe266b16a91760496d8c6dda8a107d25c2035d4a7733e29501244a569cedbc\": container with ID starting with b4fe266b16a91760496d8c6dda8a107d25c2035d4a7733e29501244a569cedbc not found: ID does not exist" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.208226 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d945b17-5c72-42b2-84ce-137447fdb572" path="/var/lib/kubelet/pods/4d945b17-5c72-42b2-84ce-137447fdb572/volumes" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.497365 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-wrdt5"] Mar 18 18:16:26 crc kubenswrapper[5008]: E0318 18:16:26.497631 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d945b17-5c72-42b2-84ce-137447fdb572" containerName="extract-utilities" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.497656 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d945b17-5c72-42b2-84ce-137447fdb572" containerName="extract-utilities" Mar 18 18:16:26 crc kubenswrapper[5008]: E0318 18:16:26.497669 5008 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4d945b17-5c72-42b2-84ce-137447fdb572" containerName="registry-server" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.497679 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d945b17-5c72-42b2-84ce-137447fdb572" containerName="registry-server" Mar 18 18:16:26 crc kubenswrapper[5008]: E0318 18:16:26.497691 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d945b17-5c72-42b2-84ce-137447fdb572" containerName="extract-content" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.497699 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d945b17-5c72-42b2-84ce-137447fdb572" containerName="extract-content" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.497819 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d945b17-5c72-42b2-84ce-137447fdb572" containerName="registry-server" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.498225 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.500628 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.500657 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.500774 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.501531 5008 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-mk8ls" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.509522 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28h49\" (UniqueName: 
\"kubernetes.io/projected/2d783fba-34bb-4969-a6eb-59c96cd838f6-kube-api-access-28h49\") pod \"crc-storage-crc-wrdt5\" (UID: \"2d783fba-34bb-4969-a6eb-59c96cd838f6\") " pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.509663 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2d783fba-34bb-4969-a6eb-59c96cd838f6-crc-storage\") pod \"crc-storage-crc-wrdt5\" (UID: \"2d783fba-34bb-4969-a6eb-59c96cd838f6\") " pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.509772 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2d783fba-34bb-4969-a6eb-59c96cd838f6-node-mnt\") pod \"crc-storage-crc-wrdt5\" (UID: \"2d783fba-34bb-4969-a6eb-59c96cd838f6\") " pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.643635 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28h49\" (UniqueName: \"kubernetes.io/projected/2d783fba-34bb-4969-a6eb-59c96cd838f6-kube-api-access-28h49\") pod \"crc-storage-crc-wrdt5\" (UID: \"2d783fba-34bb-4969-a6eb-59c96cd838f6\") " pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.644629 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2d783fba-34bb-4969-a6eb-59c96cd838f6-crc-storage\") pod \"crc-storage-crc-wrdt5\" (UID: \"2d783fba-34bb-4969-a6eb-59c96cd838f6\") " pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.644770 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2d783fba-34bb-4969-a6eb-59c96cd838f6-node-mnt\") pod 
\"crc-storage-crc-wrdt5\" (UID: \"2d783fba-34bb-4969-a6eb-59c96cd838f6\") " pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.645100 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2d783fba-34bb-4969-a6eb-59c96cd838f6-node-mnt\") pod \"crc-storage-crc-wrdt5\" (UID: \"2d783fba-34bb-4969-a6eb-59c96cd838f6\") " pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.645971 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2d783fba-34bb-4969-a6eb-59c96cd838f6-crc-storage\") pod \"crc-storage-crc-wrdt5\" (UID: \"2d783fba-34bb-4969-a6eb-59c96cd838f6\") " pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.650601 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" event={"ID":"0128d806-9156-4017-ac52-1f5f009d6999","Type":"ContainerStarted","Data":"07b31d7897bc60a9863c210ded84e4e5dccc4904bba0bd768b80f6173baf7d7a"} Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.664764 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28h49\" (UniqueName: \"kubernetes.io/projected/2d783fba-34bb-4969-a6eb-59c96cd838f6-kube-api-access-28h49\") pod \"crc-storage-crc-wrdt5\" (UID: \"2d783fba-34bb-4969-a6eb-59c96cd838f6\") " pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:26 crc kubenswrapper[5008]: I0318 18:16:26.812118 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:26 crc kubenswrapper[5008]: E0318 18:16:26.832908 5008 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-wrdt5_crc-storage_2d783fba-34bb-4969-a6eb-59c96cd838f6_0(7a894038e28ecc03a51495516f137b66865df348e141f18ce470d7ea3b5f1183): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 18:16:26 crc kubenswrapper[5008]: E0318 18:16:26.832981 5008 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-wrdt5_crc-storage_2d783fba-34bb-4969-a6eb-59c96cd838f6_0(7a894038e28ecc03a51495516f137b66865df348e141f18ce470d7ea3b5f1183): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:26 crc kubenswrapper[5008]: E0318 18:16:26.833007 5008 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-wrdt5_crc-storage_2d783fba-34bb-4969-a6eb-59c96cd838f6_0(7a894038e28ecc03a51495516f137b66865df348e141f18ce470d7ea3b5f1183): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:26 crc kubenswrapper[5008]: E0318 18:16:26.833057 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-wrdt5_crc-storage(2d783fba-34bb-4969-a6eb-59c96cd838f6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-wrdt5_crc-storage(2d783fba-34bb-4969-a6eb-59c96cd838f6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-wrdt5_crc-storage_2d783fba-34bb-4969-a6eb-59c96cd838f6_0(7a894038e28ecc03a51495516f137b66865df348e141f18ce470d7ea3b5f1183): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-wrdt5" podUID="2d783fba-34bb-4969-a6eb-59c96cd838f6" Mar 18 18:16:28 crc kubenswrapper[5008]: I0318 18:16:28.529473 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:28 crc kubenswrapper[5008]: I0318 18:16:28.600253 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:28 crc kubenswrapper[5008]: I0318 18:16:28.668644 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" event={"ID":"0128d806-9156-4017-ac52-1f5f009d6999","Type":"ContainerStarted","Data":"6e8895a08d56bbe5bd4b428f9f8009ca41e4e2831946fbc45d3cf2c7d1db924c"} Mar 18 18:16:28 crc kubenswrapper[5008]: I0318 18:16:28.701263 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" podStartSLOduration=8.701248482 podStartE2EDuration="8.701248482s" podCreationTimestamp="2026-03-18 18:16:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:16:28.70033743 +0000 UTC m=+845.219810509" 
watchObservedRunningTime="2026-03-18 18:16:28.701248482 +0000 UTC m=+845.220721561" Mar 18 18:16:28 crc kubenswrapper[5008]: I0318 18:16:28.788183 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-92gtg"] Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.630472 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-wrdt5"] Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.630680 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.631234 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:29 crc kubenswrapper[5008]: E0318 18:16:29.666206 5008 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-wrdt5_crc-storage_2d783fba-34bb-4969-a6eb-59c96cd838f6_0(c774b89f1a97e8a525a5ecd253e168f76f07fd1eda330e46950fe00e476f6296): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 18:16:29 crc kubenswrapper[5008]: E0318 18:16:29.666324 5008 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-wrdt5_crc-storage_2d783fba-34bb-4969-a6eb-59c96cd838f6_0(c774b89f1a97e8a525a5ecd253e168f76f07fd1eda330e46950fe00e476f6296): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:29 crc kubenswrapper[5008]: E0318 18:16:29.666377 5008 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-wrdt5_crc-storage_2d783fba-34bb-4969-a6eb-59c96cd838f6_0(c774b89f1a97e8a525a5ecd253e168f76f07fd1eda330e46950fe00e476f6296): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:29 crc kubenswrapper[5008]: E0318 18:16:29.666478 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-wrdt5_crc-storage(2d783fba-34bb-4969-a6eb-59c96cd838f6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-wrdt5_crc-storage(2d783fba-34bb-4969-a6eb-59c96cd838f6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-wrdt5_crc-storage_2d783fba-34bb-4969-a6eb-59c96cd838f6_0(c774b89f1a97e8a525a5ecd253e168f76f07fd1eda330e46950fe00e476f6296): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-wrdt5" podUID="2d783fba-34bb-4969-a6eb-59c96cd838f6" Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.676670 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-92gtg" podUID="cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" containerName="registry-server" containerID="cri-o://0fac5e9450748f28a7eb36f39fb0abd5b7529e4e9a400bea0c202c9aac19ab0e" gracePeriod=2 Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.677186 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.677256 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.677272 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.712770 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.723863 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.888624 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.995052 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-utilities\") pod \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\" (UID: \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\") " Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.995119 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r46q9\" (UniqueName: \"kubernetes.io/projected/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-kube-api-access-r46q9\") pod \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\" (UID: \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\") " Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.995211 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-catalog-content\") pod \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\" (UID: \"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967\") " Mar 18 18:16:29 crc kubenswrapper[5008]: I0318 18:16:29.996322 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-utilities" (OuterVolumeSpecName: "utilities") pod "cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" (UID: "cac3c361-c4f3-43dc-9c4a-dd1b56ba0967"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.001702 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-kube-api-access-r46q9" (OuterVolumeSpecName: "kube-api-access-r46q9") pod "cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" (UID: "cac3c361-c4f3-43dc-9c4a-dd1b56ba0967"). InnerVolumeSpecName "kube-api-access-r46q9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.097181 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.097242 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r46q9\" (UniqueName: \"kubernetes.io/projected/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-kube-api-access-r46q9\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.130287 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" (UID: "cac3c361-c4f3-43dc-9c4a-dd1b56ba0967"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.199169 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.685003 5008 generic.go:334] "Generic (PLEG): container finished" podID="cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" containerID="0fac5e9450748f28a7eb36f39fb0abd5b7529e4e9a400bea0c202c9aac19ab0e" exitCode=0 Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.685063 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92gtg" event={"ID":"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967","Type":"ContainerDied","Data":"0fac5e9450748f28a7eb36f39fb0abd5b7529e4e9a400bea0c202c9aac19ab0e"} Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.685133 5008 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-92gtg" event={"ID":"cac3c361-c4f3-43dc-9c4a-dd1b56ba0967","Type":"ContainerDied","Data":"954df93be977bdff01c4d76b2d087ce172984c26c65f53855061e826b84c49a5"} Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.685113 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-92gtg" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.685154 5008 scope.go:117] "RemoveContainer" containerID="0fac5e9450748f28a7eb36f39fb0abd5b7529e4e9a400bea0c202c9aac19ab0e" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.707178 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-92gtg"] Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.714147 5008 scope.go:117] "RemoveContainer" containerID="3eb87bf886d95d6c6b8ac4fb57eb9a81bfa0522e5af18fc8dc41f02a1888ba1f" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.720629 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-92gtg"] Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.734856 5008 scope.go:117] "RemoveContainer" containerID="0ffdaed5ad2591f37a4e9a3c349a45e28d1415e5fd9292ad498ef45c515c80cc" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.750498 5008 scope.go:117] "RemoveContainer" containerID="0fac5e9450748f28a7eb36f39fb0abd5b7529e4e9a400bea0c202c9aac19ab0e" Mar 18 18:16:30 crc kubenswrapper[5008]: E0318 18:16:30.750974 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fac5e9450748f28a7eb36f39fb0abd5b7529e4e9a400bea0c202c9aac19ab0e\": container with ID starting with 0fac5e9450748f28a7eb36f39fb0abd5b7529e4e9a400bea0c202c9aac19ab0e not found: ID does not exist" containerID="0fac5e9450748f28a7eb36f39fb0abd5b7529e4e9a400bea0c202c9aac19ab0e" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.751014 5008 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fac5e9450748f28a7eb36f39fb0abd5b7529e4e9a400bea0c202c9aac19ab0e"} err="failed to get container status \"0fac5e9450748f28a7eb36f39fb0abd5b7529e4e9a400bea0c202c9aac19ab0e\": rpc error: code = NotFound desc = could not find container \"0fac5e9450748f28a7eb36f39fb0abd5b7529e4e9a400bea0c202c9aac19ab0e\": container with ID starting with 0fac5e9450748f28a7eb36f39fb0abd5b7529e4e9a400bea0c202c9aac19ab0e not found: ID does not exist" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.751040 5008 scope.go:117] "RemoveContainer" containerID="3eb87bf886d95d6c6b8ac4fb57eb9a81bfa0522e5af18fc8dc41f02a1888ba1f" Mar 18 18:16:30 crc kubenswrapper[5008]: E0318 18:16:30.751648 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb87bf886d95d6c6b8ac4fb57eb9a81bfa0522e5af18fc8dc41f02a1888ba1f\": container with ID starting with 3eb87bf886d95d6c6b8ac4fb57eb9a81bfa0522e5af18fc8dc41f02a1888ba1f not found: ID does not exist" containerID="3eb87bf886d95d6c6b8ac4fb57eb9a81bfa0522e5af18fc8dc41f02a1888ba1f" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.751698 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb87bf886d95d6c6b8ac4fb57eb9a81bfa0522e5af18fc8dc41f02a1888ba1f"} err="failed to get container status \"3eb87bf886d95d6c6b8ac4fb57eb9a81bfa0522e5af18fc8dc41f02a1888ba1f\": rpc error: code = NotFound desc = could not find container \"3eb87bf886d95d6c6b8ac4fb57eb9a81bfa0522e5af18fc8dc41f02a1888ba1f\": container with ID starting with 3eb87bf886d95d6c6b8ac4fb57eb9a81bfa0522e5af18fc8dc41f02a1888ba1f not found: ID does not exist" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.751732 5008 scope.go:117] "RemoveContainer" containerID="0ffdaed5ad2591f37a4e9a3c349a45e28d1415e5fd9292ad498ef45c515c80cc" Mar 18 18:16:30 crc kubenswrapper[5008]: E0318 
18:16:30.752433 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ffdaed5ad2591f37a4e9a3c349a45e28d1415e5fd9292ad498ef45c515c80cc\": container with ID starting with 0ffdaed5ad2591f37a4e9a3c349a45e28d1415e5fd9292ad498ef45c515c80cc not found: ID does not exist" containerID="0ffdaed5ad2591f37a4e9a3c349a45e28d1415e5fd9292ad498ef45c515c80cc" Mar 18 18:16:30 crc kubenswrapper[5008]: I0318 18:16:30.752529 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ffdaed5ad2591f37a4e9a3c349a45e28d1415e5fd9292ad498ef45c515c80cc"} err="failed to get container status \"0ffdaed5ad2591f37a4e9a3c349a45e28d1415e5fd9292ad498ef45c515c80cc\": rpc error: code = NotFound desc = could not find container \"0ffdaed5ad2591f37a4e9a3c349a45e28d1415e5fd9292ad498ef45c515c80cc\": container with ID starting with 0ffdaed5ad2591f37a4e9a3c349a45e28d1415e5fd9292ad498ef45c515c80cc not found: ID does not exist" Mar 18 18:16:32 crc kubenswrapper[5008]: I0318 18:16:32.221658 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" path="/var/lib/kubelet/pods/cac3c361-c4f3-43dc-9c4a-dd1b56ba0967/volumes" Mar 18 18:16:36 crc kubenswrapper[5008]: I0318 18:16:36.278440 5008 scope.go:117] "RemoveContainer" containerID="e2234aa472c18db948eca55728aafb0b9ce11cc668725b780d9ab170bb87bdc3" Mar 18 18:16:36 crc kubenswrapper[5008]: I0318 18:16:36.334937 5008 scope.go:117] "RemoveContainer" containerID="49e87bbd2ba41b38445ad4d5a4cac446075b67c32199067b8e9316283fdc1d0b" Mar 18 18:16:36 crc kubenswrapper[5008]: I0318 18:16:36.739434 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgv8s_9b8d2b81-71c9-44b4-86ad-8a3ec4c0c2dd/kube-multus/2.log" Mar 18 18:16:44 crc kubenswrapper[5008]: I0318 18:16:44.197407 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:44 crc kubenswrapper[5008]: I0318 18:16:44.204196 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:44 crc kubenswrapper[5008]: I0318 18:16:44.686426 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-wrdt5"] Mar 18 18:16:44 crc kubenswrapper[5008]: I0318 18:16:44.805232 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wrdt5" event={"ID":"2d783fba-34bb-4969-a6eb-59c96cd838f6","Type":"ContainerStarted","Data":"3d4ff35a1de86932a93da636dc2ef1ed01974b7a05c0bf924527bd557887bc62"} Mar 18 18:16:46 crc kubenswrapper[5008]: I0318 18:16:46.817520 5008 generic.go:334] "Generic (PLEG): container finished" podID="2d783fba-34bb-4969-a6eb-59c96cd838f6" containerID="c21904ad77e04ffffcedfc97a0b530ee6c7a3f64dac9b6dcde055b60ae8aeae5" exitCode=0 Mar 18 18:16:46 crc kubenswrapper[5008]: I0318 18:16:46.817642 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wrdt5" event={"ID":"2d783fba-34bb-4969-a6eb-59c96cd838f6","Type":"ContainerDied","Data":"c21904ad77e04ffffcedfc97a0b530ee6c7a3f64dac9b6dcde055b60ae8aeae5"} Mar 18 18:16:48 crc kubenswrapper[5008]: I0318 18:16:48.082059 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:48 crc kubenswrapper[5008]: I0318 18:16:48.242871 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28h49\" (UniqueName: \"kubernetes.io/projected/2d783fba-34bb-4969-a6eb-59c96cd838f6-kube-api-access-28h49\") pod \"2d783fba-34bb-4969-a6eb-59c96cd838f6\" (UID: \"2d783fba-34bb-4969-a6eb-59c96cd838f6\") " Mar 18 18:16:48 crc kubenswrapper[5008]: I0318 18:16:48.242924 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2d783fba-34bb-4969-a6eb-59c96cd838f6-crc-storage\") pod \"2d783fba-34bb-4969-a6eb-59c96cd838f6\" (UID: \"2d783fba-34bb-4969-a6eb-59c96cd838f6\") " Mar 18 18:16:48 crc kubenswrapper[5008]: I0318 18:16:48.242991 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2d783fba-34bb-4969-a6eb-59c96cd838f6-node-mnt\") pod \"2d783fba-34bb-4969-a6eb-59c96cd838f6\" (UID: \"2d783fba-34bb-4969-a6eb-59c96cd838f6\") " Mar 18 18:16:48 crc kubenswrapper[5008]: I0318 18:16:48.243270 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d783fba-34bb-4969-a6eb-59c96cd838f6-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "2d783fba-34bb-4969-a6eb-59c96cd838f6" (UID: "2d783fba-34bb-4969-a6eb-59c96cd838f6"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:16:48 crc kubenswrapper[5008]: I0318 18:16:48.249822 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d783fba-34bb-4969-a6eb-59c96cd838f6-kube-api-access-28h49" (OuterVolumeSpecName: "kube-api-access-28h49") pod "2d783fba-34bb-4969-a6eb-59c96cd838f6" (UID: "2d783fba-34bb-4969-a6eb-59c96cd838f6"). InnerVolumeSpecName "kube-api-access-28h49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:16:48 crc kubenswrapper[5008]: I0318 18:16:48.259209 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d783fba-34bb-4969-a6eb-59c96cd838f6-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "2d783fba-34bb-4969-a6eb-59c96cd838f6" (UID: "2d783fba-34bb-4969-a6eb-59c96cd838f6"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:16:48 crc kubenswrapper[5008]: I0318 18:16:48.344051 5008 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2d783fba-34bb-4969-a6eb-59c96cd838f6-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:48 crc kubenswrapper[5008]: I0318 18:16:48.344097 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28h49\" (UniqueName: \"kubernetes.io/projected/2d783fba-34bb-4969-a6eb-59c96cd838f6-kube-api-access-28h49\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:48 crc kubenswrapper[5008]: I0318 18:16:48.344110 5008 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2d783fba-34bb-4969-a6eb-59c96cd838f6-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 18 18:16:48 crc kubenswrapper[5008]: I0318 18:16:48.836301 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wrdt5" event={"ID":"2d783fba-34bb-4969-a6eb-59c96cd838f6","Type":"ContainerDied","Data":"3d4ff35a1de86932a93da636dc2ef1ed01974b7a05c0bf924527bd557887bc62"} Mar 18 18:16:48 crc kubenswrapper[5008]: I0318 18:16:48.836361 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d4ff35a1de86932a93da636dc2ef1ed01974b7a05c0bf924527bd557887bc62" Mar 18 18:16:48 crc kubenswrapper[5008]: I0318 18:16:48.836420 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-wrdt5" Mar 18 18:16:51 crc kubenswrapper[5008]: I0318 18:16:51.281227 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4twbz" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.274859 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q"] Mar 18 18:16:56 crc kubenswrapper[5008]: E0318 18:16:56.277087 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" containerName="registry-server" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.277126 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" containerName="registry-server" Mar 18 18:16:56 crc kubenswrapper[5008]: E0318 18:16:56.277147 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" containerName="extract-content" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.277156 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" containerName="extract-content" Mar 18 18:16:56 crc kubenswrapper[5008]: E0318 18:16:56.277168 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d783fba-34bb-4969-a6eb-59c96cd838f6" containerName="storage" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.277176 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d783fba-34bb-4969-a6eb-59c96cd838f6" containerName="storage" Mar 18 18:16:56 crc kubenswrapper[5008]: E0318 18:16:56.277189 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" containerName="extract-utilities" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.277197 5008 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" containerName="extract-utilities" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.277303 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d783fba-34bb-4969-a6eb-59c96cd838f6" containerName="storage" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.277323 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac3c361-c4f3-43dc-9c4a-dd1b56ba0967" containerName="registry-server" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.278094 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.281286 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.287737 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q"] Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.451630 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04389aab-de3f-49c6-ab43-d978cd41f38d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q\" (UID: \"04389aab-de3f-49c6-ab43-d978cd41f38d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.451735 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04389aab-de3f-49c6-ab43-d978cd41f38d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q\" (UID: \"04389aab-de3f-49c6-ab43-d978cd41f38d\") " 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.451814 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kngm\" (UniqueName: \"kubernetes.io/projected/04389aab-de3f-49c6-ab43-d978cd41f38d-kube-api-access-8kngm\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q\" (UID: \"04389aab-de3f-49c6-ab43-d978cd41f38d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.554355 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04389aab-de3f-49c6-ab43-d978cd41f38d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q\" (UID: \"04389aab-de3f-49c6-ab43-d978cd41f38d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.554610 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04389aab-de3f-49c6-ab43-d978cd41f38d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q\" (UID: \"04389aab-de3f-49c6-ab43-d978cd41f38d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.554747 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kngm\" (UniqueName: \"kubernetes.io/projected/04389aab-de3f-49c6-ab43-d978cd41f38d-kube-api-access-8kngm\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q\" (UID: \"04389aab-de3f-49c6-ab43-d978cd41f38d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" Mar 18 
18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.554913 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04389aab-de3f-49c6-ab43-d978cd41f38d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q\" (UID: \"04389aab-de3f-49c6-ab43-d978cd41f38d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.555354 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04389aab-de3f-49c6-ab43-d978cd41f38d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q\" (UID: \"04389aab-de3f-49c6-ab43-d978cd41f38d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.593928 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kngm\" (UniqueName: \"kubernetes.io/projected/04389aab-de3f-49c6-ab43-d978cd41f38d-kube-api-access-8kngm\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q\" (UID: \"04389aab-de3f-49c6-ab43-d978cd41f38d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.605101 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.791077 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q"] Mar 18 18:16:56 crc kubenswrapper[5008]: I0318 18:16:56.886915 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" event={"ID":"04389aab-de3f-49c6-ab43-d978cd41f38d","Type":"ContainerStarted","Data":"84e75c60c066a2eb2091f623645d52a3a9ad71d4916ceb5408041e35b2658065"} Mar 18 18:16:57 crc kubenswrapper[5008]: I0318 18:16:57.897103 5008 generic.go:334] "Generic (PLEG): container finished" podID="04389aab-de3f-49c6-ab43-d978cd41f38d" containerID="4fac62189f59cbc081d81c8785587885358b13bc4daa34710ba3e6c4409c175d" exitCode=0 Mar 18 18:16:57 crc kubenswrapper[5008]: I0318 18:16:57.897158 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" event={"ID":"04389aab-de3f-49c6-ab43-d978cd41f38d","Type":"ContainerDied","Data":"4fac62189f59cbc081d81c8785587885358b13bc4daa34710ba3e6c4409c175d"} Mar 18 18:17:06 crc kubenswrapper[5008]: I0318 18:17:06.958192 5008 generic.go:334] "Generic (PLEG): container finished" podID="04389aab-de3f-49c6-ab43-d978cd41f38d" containerID="2ca9f023552e024451412fd9c5bc14f0b70611b677098cb551b8111963cbafa3" exitCode=0 Mar 18 18:17:06 crc kubenswrapper[5008]: I0318 18:17:06.958254 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" event={"ID":"04389aab-de3f-49c6-ab43-d978cd41f38d","Type":"ContainerDied","Data":"2ca9f023552e024451412fd9c5bc14f0b70611b677098cb551b8111963cbafa3"} Mar 18 18:17:07 crc kubenswrapper[5008]: I0318 18:17:07.968725 5008 
generic.go:334] "Generic (PLEG): container finished" podID="04389aab-de3f-49c6-ab43-d978cd41f38d" containerID="568a4642c00addee2af801e001833b4e4c9577be580650f6f58cb5574e3c6965" exitCode=0 Mar 18 18:17:07 crc kubenswrapper[5008]: I0318 18:17:07.968844 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" event={"ID":"04389aab-de3f-49c6-ab43-d978cd41f38d","Type":"ContainerDied","Data":"568a4642c00addee2af801e001833b4e4c9577be580650f6f58cb5574e3c6965"} Mar 18 18:17:09 crc kubenswrapper[5008]: I0318 18:17:09.266464 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" Mar 18 18:17:09 crc kubenswrapper[5008]: I0318 18:17:09.434761 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04389aab-de3f-49c6-ab43-d978cd41f38d-bundle\") pod \"04389aab-de3f-49c6-ab43-d978cd41f38d\" (UID: \"04389aab-de3f-49c6-ab43-d978cd41f38d\") " Mar 18 18:17:09 crc kubenswrapper[5008]: I0318 18:17:09.434884 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kngm\" (UniqueName: \"kubernetes.io/projected/04389aab-de3f-49c6-ab43-d978cd41f38d-kube-api-access-8kngm\") pod \"04389aab-de3f-49c6-ab43-d978cd41f38d\" (UID: \"04389aab-de3f-49c6-ab43-d978cd41f38d\") " Mar 18 18:17:09 crc kubenswrapper[5008]: I0318 18:17:09.434955 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04389aab-de3f-49c6-ab43-d978cd41f38d-util\") pod \"04389aab-de3f-49c6-ab43-d978cd41f38d\" (UID: \"04389aab-de3f-49c6-ab43-d978cd41f38d\") " Mar 18 18:17:09 crc kubenswrapper[5008]: I0318 18:17:09.436050 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/04389aab-de3f-49c6-ab43-d978cd41f38d-bundle" (OuterVolumeSpecName: "bundle") pod "04389aab-de3f-49c6-ab43-d978cd41f38d" (UID: "04389aab-de3f-49c6-ab43-d978cd41f38d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:17:09 crc kubenswrapper[5008]: I0318 18:17:09.444773 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04389aab-de3f-49c6-ab43-d978cd41f38d-kube-api-access-8kngm" (OuterVolumeSpecName: "kube-api-access-8kngm") pod "04389aab-de3f-49c6-ab43-d978cd41f38d" (UID: "04389aab-de3f-49c6-ab43-d978cd41f38d"). InnerVolumeSpecName "kube-api-access-8kngm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:17:09 crc kubenswrapper[5008]: I0318 18:17:09.457606 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04389aab-de3f-49c6-ab43-d978cd41f38d-util" (OuterVolumeSpecName: "util") pod "04389aab-de3f-49c6-ab43-d978cd41f38d" (UID: "04389aab-de3f-49c6-ab43-d978cd41f38d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:17:09 crc kubenswrapper[5008]: I0318 18:17:09.536406 5008 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04389aab-de3f-49c6-ab43-d978cd41f38d-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:09 crc kubenswrapper[5008]: I0318 18:17:09.536461 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kngm\" (UniqueName: \"kubernetes.io/projected/04389aab-de3f-49c6-ab43-d978cd41f38d-kube-api-access-8kngm\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:09 crc kubenswrapper[5008]: I0318 18:17:09.536481 5008 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04389aab-de3f-49c6-ab43-d978cd41f38d-util\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:10 crc kubenswrapper[5008]: I0318 18:17:10.000277 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" event={"ID":"04389aab-de3f-49c6-ab43-d978cd41f38d","Type":"ContainerDied","Data":"84e75c60c066a2eb2091f623645d52a3a9ad71d4916ceb5408041e35b2658065"} Mar 18 18:17:10 crc kubenswrapper[5008]: I0318 18:17:10.000339 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84e75c60c066a2eb2091f623645d52a3a9ad71d4916ceb5408041e35b2658065" Mar 18 18:17:10 crc kubenswrapper[5008]: I0318 18:17:10.000310 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q" Mar 18 18:17:12 crc kubenswrapper[5008]: I0318 18:17:12.903923 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-jq4ln"] Mar 18 18:17:12 crc kubenswrapper[5008]: E0318 18:17:12.904531 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04389aab-de3f-49c6-ab43-d978cd41f38d" containerName="pull" Mar 18 18:17:12 crc kubenswrapper[5008]: I0318 18:17:12.904547 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="04389aab-de3f-49c6-ab43-d978cd41f38d" containerName="pull" Mar 18 18:17:12 crc kubenswrapper[5008]: E0318 18:17:12.904618 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04389aab-de3f-49c6-ab43-d978cd41f38d" containerName="util" Mar 18 18:17:12 crc kubenswrapper[5008]: I0318 18:17:12.904626 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="04389aab-de3f-49c6-ab43-d978cd41f38d" containerName="util" Mar 18 18:17:12 crc kubenswrapper[5008]: E0318 18:17:12.904638 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04389aab-de3f-49c6-ab43-d978cd41f38d" containerName="extract" Mar 18 18:17:12 crc kubenswrapper[5008]: I0318 18:17:12.904646 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="04389aab-de3f-49c6-ab43-d978cd41f38d" containerName="extract" Mar 18 18:17:12 crc kubenswrapper[5008]: I0318 18:17:12.904764 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="04389aab-de3f-49c6-ab43-d978cd41f38d" containerName="extract" Mar 18 18:17:12 crc kubenswrapper[5008]: I0318 18:17:12.905186 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-jq4ln" Mar 18 18:17:12 crc kubenswrapper[5008]: I0318 18:17:12.907288 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-7twbm" Mar 18 18:17:12 crc kubenswrapper[5008]: I0318 18:17:12.907486 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 18 18:17:12 crc kubenswrapper[5008]: I0318 18:17:12.907696 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 18 18:17:12 crc kubenswrapper[5008]: I0318 18:17:12.915261 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-jq4ln"] Mar 18 18:17:13 crc kubenswrapper[5008]: I0318 18:17:13.079703 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkjn7\" (UniqueName: \"kubernetes.io/projected/30844b92-1089-4729-9e30-38cb366fbe0f-kube-api-access-mkjn7\") pod \"nmstate-operator-796d4cfff4-jq4ln\" (UID: \"30844b92-1089-4729-9e30-38cb366fbe0f\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-jq4ln" Mar 18 18:17:13 crc kubenswrapper[5008]: I0318 18:17:13.181280 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkjn7\" (UniqueName: \"kubernetes.io/projected/30844b92-1089-4729-9e30-38cb366fbe0f-kube-api-access-mkjn7\") pod \"nmstate-operator-796d4cfff4-jq4ln\" (UID: \"30844b92-1089-4729-9e30-38cb366fbe0f\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-jq4ln" Mar 18 18:17:13 crc kubenswrapper[5008]: I0318 18:17:13.202789 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkjn7\" (UniqueName: \"kubernetes.io/projected/30844b92-1089-4729-9e30-38cb366fbe0f-kube-api-access-mkjn7\") pod \"nmstate-operator-796d4cfff4-jq4ln\" (UID: 
\"30844b92-1089-4729-9e30-38cb366fbe0f\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-jq4ln" Mar 18 18:17:13 crc kubenswrapper[5008]: I0318 18:17:13.225336 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-jq4ln" Mar 18 18:17:13 crc kubenswrapper[5008]: I0318 18:17:13.406411 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-jq4ln"] Mar 18 18:17:13 crc kubenswrapper[5008]: I0318 18:17:13.416959 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:17:14 crc kubenswrapper[5008]: I0318 18:17:14.024414 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-jq4ln" event={"ID":"30844b92-1089-4729-9e30-38cb366fbe0f","Type":"ContainerStarted","Data":"0b9bc49d07148e4157b495c87fb70ddfc8f61b917e4644796eec427cf0edd2b0"} Mar 18 18:17:17 crc kubenswrapper[5008]: I0318 18:17:17.048980 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-jq4ln" event={"ID":"30844b92-1089-4729-9e30-38cb366fbe0f","Type":"ContainerStarted","Data":"1442cd985902ada612f44f05376c4e13cec319ab48c1aa1843782ad2f64d9800"} Mar 18 18:17:17 crc kubenswrapper[5008]: I0318 18:17:17.066066 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-jq4ln" podStartSLOduration=2.536781887 podStartE2EDuration="5.066049315s" podCreationTimestamp="2026-03-18 18:17:12 +0000 UTC" firstStartedPulling="2026-03-18 18:17:13.41670739 +0000 UTC m=+889.936180469" lastFinishedPulling="2026-03-18 18:17:15.945974808 +0000 UTC m=+892.465447897" observedRunningTime="2026-03-18 18:17:17.064796481 +0000 UTC m=+893.584269590" watchObservedRunningTime="2026-03-18 18:17:17.066049315 +0000 UTC m=+893.585522394" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.015229 5008 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-cqxqv"] Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.015996 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqxqv" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.017984 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-nkrrv" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.035757 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-cqxqv"] Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.041008 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5"] Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.042016 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.053140 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.056793 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5"] Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.073051 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-dfm4p"] Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.073913 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.141420 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnfsr\" (UniqueName: \"kubernetes.io/projected/eb0f4cc1-7468-434f-bc63-7c3575621186-kube-api-access-fnfsr\") pod \"nmstate-webhook-5f558f5558-vnfv5\" (UID: \"eb0f4cc1-7468-434f-bc63-7c3575621186\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.141891 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/eb0f4cc1-7468-434f-bc63-7c3575621186-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-vnfv5\" (UID: \"eb0f4cc1-7468-434f-bc63-7c3575621186\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.141983 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jccpf\" (UniqueName: \"kubernetes.io/projected/ed341cb5-f441-4f05-951b-973883b19672-kube-api-access-jccpf\") pod \"nmstate-metrics-9b8c8685d-cqxqv\" (UID: \"ed341cb5-f441-4f05-951b-973883b19672\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqxqv" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.152622 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq"] Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.153389 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.157902 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.157933 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.158118 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-t5fms" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.196786 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq"] Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.243652 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnfsr\" (UniqueName: \"kubernetes.io/projected/eb0f4cc1-7468-434f-bc63-7c3575621186-kube-api-access-fnfsr\") pod \"nmstate-webhook-5f558f5558-vnfv5\" (UID: \"eb0f4cc1-7468-434f-bc63-7c3575621186\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.243731 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/eb0f4cc1-7468-434f-bc63-7c3575621186-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-vnfv5\" (UID: \"eb0f4cc1-7468-434f-bc63-7c3575621186\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.243759 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c698aa79-da33-410b-86db-38e9fd3a4806-nmstate-lock\") pod \"nmstate-handler-dfm4p\" (UID: \"c698aa79-da33-410b-86db-38e9fd3a4806\") " pod="openshift-nmstate/nmstate-handler-dfm4p" 
Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.243775 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c698aa79-da33-410b-86db-38e9fd3a4806-ovs-socket\") pod \"nmstate-handler-dfm4p\" (UID: \"c698aa79-da33-410b-86db-38e9fd3a4806\") " pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.243798 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jccpf\" (UniqueName: \"kubernetes.io/projected/ed341cb5-f441-4f05-951b-973883b19672-kube-api-access-jccpf\") pod \"nmstate-metrics-9b8c8685d-cqxqv\" (UID: \"ed341cb5-f441-4f05-951b-973883b19672\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqxqv" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.243816 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qn9l\" (UniqueName: \"kubernetes.io/projected/c698aa79-da33-410b-86db-38e9fd3a4806-kube-api-access-7qn9l\") pod \"nmstate-handler-dfm4p\" (UID: \"c698aa79-da33-410b-86db-38e9fd3a4806\") " pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.243863 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c698aa79-da33-410b-86db-38e9fd3a4806-dbus-socket\") pod \"nmstate-handler-dfm4p\" (UID: \"c698aa79-da33-410b-86db-38e9fd3a4806\") " pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.263334 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/eb0f4cc1-7468-434f-bc63-7c3575621186-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-vnfv5\" (UID: \"eb0f4cc1-7468-434f-bc63-7c3575621186\") " 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.264232 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jccpf\" (UniqueName: \"kubernetes.io/projected/ed341cb5-f441-4f05-951b-973883b19672-kube-api-access-jccpf\") pod \"nmstate-metrics-9b8c8685d-cqxqv\" (UID: \"ed341cb5-f441-4f05-951b-973883b19672\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqxqv" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.266381 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnfsr\" (UniqueName: \"kubernetes.io/projected/eb0f4cc1-7468-434f-bc63-7c3575621186-kube-api-access-fnfsr\") pod \"nmstate-webhook-5f558f5558-vnfv5\" (UID: \"eb0f4cc1-7468-434f-bc63-7c3575621186\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.338358 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqxqv" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.344769 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq9mw\" (UniqueName: \"kubernetes.io/projected/72afd8eb-d07a-4828-ab98-c094097c937d-kube-api-access-fq9mw\") pod \"nmstate-console-plugin-86f58fcf4-p95vq\" (UID: \"72afd8eb-d07a-4828-ab98-c094097c937d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.344822 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c698aa79-da33-410b-86db-38e9fd3a4806-dbus-socket\") pod \"nmstate-handler-dfm4p\" (UID: \"c698aa79-da33-410b-86db-38e9fd3a4806\") " pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.344876 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c698aa79-da33-410b-86db-38e9fd3a4806-nmstate-lock\") pod \"nmstate-handler-dfm4p\" (UID: \"c698aa79-da33-410b-86db-38e9fd3a4806\") " pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.344889 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c698aa79-da33-410b-86db-38e9fd3a4806-ovs-socket\") pod \"nmstate-handler-dfm4p\" (UID: \"c698aa79-da33-410b-86db-38e9fd3a4806\") " pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.344906 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/72afd8eb-d07a-4828-ab98-c094097c937d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-p95vq\" (UID: \"72afd8eb-d07a-4828-ab98-c094097c937d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.344929 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/72afd8eb-d07a-4828-ab98-c094097c937d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-p95vq\" (UID: \"72afd8eb-d07a-4828-ab98-c094097c937d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.344945 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qn9l\" (UniqueName: \"kubernetes.io/projected/c698aa79-da33-410b-86db-38e9fd3a4806-kube-api-access-7qn9l\") pod \"nmstate-handler-dfm4p\" (UID: \"c698aa79-da33-410b-86db-38e9fd3a4806\") " pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.345315 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c698aa79-da33-410b-86db-38e9fd3a4806-ovs-socket\") pod \"nmstate-handler-dfm4p\" (UID: \"c698aa79-da33-410b-86db-38e9fd3a4806\") " pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.345484 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c698aa79-da33-410b-86db-38e9fd3a4806-nmstate-lock\") pod \"nmstate-handler-dfm4p\" (UID: \"c698aa79-da33-410b-86db-38e9fd3a4806\") " pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.345628 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c698aa79-da33-410b-86db-38e9fd3a4806-dbus-socket\") pod \"nmstate-handler-dfm4p\" (UID: \"c698aa79-da33-410b-86db-38e9fd3a4806\") " pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.363257 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.386461 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qn9l\" (UniqueName: \"kubernetes.io/projected/c698aa79-da33-410b-86db-38e9fd3a4806-kube-api-access-7qn9l\") pod \"nmstate-handler-dfm4p\" (UID: \"c698aa79-da33-410b-86db-38e9fd3a4806\") " pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.397484 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.405298 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-9876dd4d8-vcjr8"] Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.406674 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.414160 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9876dd4d8-vcjr8"] Mar 18 18:17:18 crc kubenswrapper[5008]: W0318 18:17:18.436693 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc698aa79_da33_410b_86db_38e9fd3a4806.slice/crio-ebd56f4cef2f4e0883efbb7b685051847d6b79e10b77736dd2e7efc837696a99 WatchSource:0}: Error finding container ebd56f4cef2f4e0883efbb7b685051847d6b79e10b77736dd2e7efc837696a99: Status 404 returned error can't find the container with id ebd56f4cef2f4e0883efbb7b685051847d6b79e10b77736dd2e7efc837696a99 Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.446901 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/72afd8eb-d07a-4828-ab98-c094097c937d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-p95vq\" (UID: \"72afd8eb-d07a-4828-ab98-c094097c937d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.446936 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/72afd8eb-d07a-4828-ab98-c094097c937d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-p95vq\" (UID: \"72afd8eb-d07a-4828-ab98-c094097c937d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" Mar 18 18:17:18 crc 
kubenswrapper[5008]: I0318 18:17:18.446969 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq9mw\" (UniqueName: \"kubernetes.io/projected/72afd8eb-d07a-4828-ab98-c094097c937d-kube-api-access-fq9mw\") pod \"nmstate-console-plugin-86f58fcf4-p95vq\" (UID: \"72afd8eb-d07a-4828-ab98-c094097c937d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" Mar 18 18:17:18 crc kubenswrapper[5008]: E0318 18:17:18.447236 5008 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 18 18:17:18 crc kubenswrapper[5008]: E0318 18:17:18.447279 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72afd8eb-d07a-4828-ab98-c094097c937d-plugin-serving-cert podName:72afd8eb-d07a-4828-ab98-c094097c937d nodeName:}" failed. No retries permitted until 2026-03-18 18:17:18.947264133 +0000 UTC m=+895.466737202 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/72afd8eb-d07a-4828-ab98-c094097c937d-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-p95vq" (UID: "72afd8eb-d07a-4828-ab98-c094097c937d") : secret "plugin-serving-cert" not found Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.453240 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/72afd8eb-d07a-4828-ab98-c094097c937d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-p95vq\" (UID: \"72afd8eb-d07a-4828-ab98-c094097c937d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.468917 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq9mw\" (UniqueName: \"kubernetes.io/projected/72afd8eb-d07a-4828-ab98-c094097c937d-kube-api-access-fq9mw\") pod \"nmstate-console-plugin-86f58fcf4-p95vq\" (UID: 
\"72afd8eb-d07a-4828-ab98-c094097c937d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.548473 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1fa9e44f-0ccb-46ae-8663-b915a08e2930-console-config\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.548841 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fa9e44f-0ccb-46ae-8663-b915a08e2930-trusted-ca-bundle\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.548876 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1fa9e44f-0ccb-46ae-8663-b915a08e2930-console-oauth-config\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.548902 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1fa9e44f-0ccb-46ae-8663-b915a08e2930-oauth-serving-cert\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.548934 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1fa9e44f-0ccb-46ae-8663-b915a08e2930-console-serving-cert\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.548959 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fa9e44f-0ccb-46ae-8663-b915a08e2930-service-ca\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.548987 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7tdl\" (UniqueName: \"kubernetes.io/projected/1fa9e44f-0ccb-46ae-8663-b915a08e2930-kube-api-access-m7tdl\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.630520 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-cqxqv"] Mar 18 18:17:18 crc kubenswrapper[5008]: W0318 18:17:18.636314 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded341cb5_f441_4f05_951b_973883b19672.slice/crio-bebd79b398fc3de1e41b1d0f6e2e23a89d13a9464a3711c944daa518a6d4bb9a WatchSource:0}: Error finding container bebd79b398fc3de1e41b1d0f6e2e23a89d13a9464a3711c944daa518a6d4bb9a: Status 404 returned error can't find the container with id bebd79b398fc3de1e41b1d0f6e2e23a89d13a9464a3711c944daa518a6d4bb9a Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.650761 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fa9e44f-0ccb-46ae-8663-b915a08e2930-service-ca\") pod 
\"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.650811 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7tdl\" (UniqueName: \"kubernetes.io/projected/1fa9e44f-0ccb-46ae-8663-b915a08e2930-kube-api-access-m7tdl\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.650838 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1fa9e44f-0ccb-46ae-8663-b915a08e2930-console-config\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.650857 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fa9e44f-0ccb-46ae-8663-b915a08e2930-trusted-ca-bundle\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.650887 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1fa9e44f-0ccb-46ae-8663-b915a08e2930-console-oauth-config\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.650913 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1fa9e44f-0ccb-46ae-8663-b915a08e2930-oauth-serving-cert\") pod \"console-9876dd4d8-vcjr8\" 
(UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.650947 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1fa9e44f-0ccb-46ae-8663-b915a08e2930-console-serving-cert\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.652542 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fa9e44f-0ccb-46ae-8663-b915a08e2930-trusted-ca-bundle\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.652641 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1fa9e44f-0ccb-46ae-8663-b915a08e2930-service-ca\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.652663 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1fa9e44f-0ccb-46ae-8663-b915a08e2930-oauth-serving-cert\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.652839 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1fa9e44f-0ccb-46ae-8663-b915a08e2930-console-config\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " 
pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.655730 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1fa9e44f-0ccb-46ae-8663-b915a08e2930-console-oauth-config\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.657633 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1fa9e44f-0ccb-46ae-8663-b915a08e2930-console-serving-cert\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.665325 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7tdl\" (UniqueName: \"kubernetes.io/projected/1fa9e44f-0ccb-46ae-8663-b915a08e2930-kube-api-access-m7tdl\") pod \"console-9876dd4d8-vcjr8\" (UID: \"1fa9e44f-0ccb-46ae-8663-b915a08e2930\") " pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.733729 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.794556 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5"] Mar 18 18:17:18 crc kubenswrapper[5008]: W0318 18:17:18.920831 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa9e44f_0ccb_46ae_8663_b915a08e2930.slice/crio-093b03b8d60e983f2a81c278a718a158fe30e6f3a365312abaaa3417feb15202 WatchSource:0}: Error finding container 093b03b8d60e983f2a81c278a718a158fe30e6f3a365312abaaa3417feb15202: Status 404 returned error can't find the container with id 093b03b8d60e983f2a81c278a718a158fe30e6f3a365312abaaa3417feb15202 Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.921201 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9876dd4d8-vcjr8"] Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.955655 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/72afd8eb-d07a-4828-ab98-c094097c937d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-p95vq\" (UID: \"72afd8eb-d07a-4828-ab98-c094097c937d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" Mar 18 18:17:18 crc kubenswrapper[5008]: I0318 18:17:18.962059 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/72afd8eb-d07a-4828-ab98-c094097c937d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-p95vq\" (UID: \"72afd8eb-d07a-4828-ab98-c094097c937d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" Mar 18 18:17:19 crc kubenswrapper[5008]: I0318 18:17:19.061123 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dfm4p" 
event={"ID":"c698aa79-da33-410b-86db-38e9fd3a4806","Type":"ContainerStarted","Data":"ebd56f4cef2f4e0883efbb7b685051847d6b79e10b77736dd2e7efc837696a99"} Mar 18 18:17:19 crc kubenswrapper[5008]: I0318 18:17:19.062090 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5" event={"ID":"eb0f4cc1-7468-434f-bc63-7c3575621186","Type":"ContainerStarted","Data":"1f470d6199eb7562fe441809e34d802e05a4c224c9d37d734058b3fad6dc5a9c"} Mar 18 18:17:19 crc kubenswrapper[5008]: I0318 18:17:19.063216 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9876dd4d8-vcjr8" event={"ID":"1fa9e44f-0ccb-46ae-8663-b915a08e2930","Type":"ContainerStarted","Data":"093b03b8d60e983f2a81c278a718a158fe30e6f3a365312abaaa3417feb15202"} Mar 18 18:17:19 crc kubenswrapper[5008]: I0318 18:17:19.064276 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqxqv" event={"ID":"ed341cb5-f441-4f05-951b-973883b19672","Type":"ContainerStarted","Data":"bebd79b398fc3de1e41b1d0f6e2e23a89d13a9464a3711c944daa518a6d4bb9a"} Mar 18 18:17:19 crc kubenswrapper[5008]: I0318 18:17:19.079739 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" Mar 18 18:17:19 crc kubenswrapper[5008]: I0318 18:17:19.266411 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq"] Mar 18 18:17:19 crc kubenswrapper[5008]: W0318 18:17:19.276826 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72afd8eb_d07a_4828_ab98_c094097c937d.slice/crio-4ba30e29d02a05a13635859d37bc226c182963cf54a09953372ace3969ebad96 WatchSource:0}: Error finding container 4ba30e29d02a05a13635859d37bc226c182963cf54a09953372ace3969ebad96: Status 404 returned error can't find the container with id 4ba30e29d02a05a13635859d37bc226c182963cf54a09953372ace3969ebad96 Mar 18 18:17:20 crc kubenswrapper[5008]: I0318 18:17:20.071726 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" event={"ID":"72afd8eb-d07a-4828-ab98-c094097c937d","Type":"ContainerStarted","Data":"4ba30e29d02a05a13635859d37bc226c182963cf54a09953372ace3969ebad96"} Mar 18 18:17:20 crc kubenswrapper[5008]: I0318 18:17:20.074235 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9876dd4d8-vcjr8" event={"ID":"1fa9e44f-0ccb-46ae-8663-b915a08e2930","Type":"ContainerStarted","Data":"ce5c9bb38aa366ec4054dabe2fd4242802ea09f5e2b1ba23e90d5ddf80493508"} Mar 18 18:17:20 crc kubenswrapper[5008]: I0318 18:17:20.098632 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9876dd4d8-vcjr8" podStartSLOduration=2.098611013 podStartE2EDuration="2.098611013s" podCreationTimestamp="2026-03-18 18:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:17:20.095909781 +0000 UTC m=+896.615382870" watchObservedRunningTime="2026-03-18 
18:17:20.098611013 +0000 UTC m=+896.618084112" Mar 18 18:17:22 crc kubenswrapper[5008]: I0318 18:17:22.093593 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dfm4p" event={"ID":"c698aa79-da33-410b-86db-38e9fd3a4806","Type":"ContainerStarted","Data":"3340eb29538c2806f9e22b5997c7f24020b285d5ca499d6a2a7ce338086ffd28"} Mar 18 18:17:22 crc kubenswrapper[5008]: I0318 18:17:22.094260 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:22 crc kubenswrapper[5008]: I0318 18:17:22.096916 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5" event={"ID":"eb0f4cc1-7468-434f-bc63-7c3575621186","Type":"ContainerStarted","Data":"d1d556388e5d0c4b437fc0ceb544760f5c3fe3eb4829736e03da0292481c4171"} Mar 18 18:17:22 crc kubenswrapper[5008]: I0318 18:17:22.097000 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5" Mar 18 18:17:22 crc kubenswrapper[5008]: I0318 18:17:22.099022 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqxqv" event={"ID":"ed341cb5-f441-4f05-951b-973883b19672","Type":"ContainerStarted","Data":"5b9ef808b90f034e9c1a3d570c733df5cd2e413a2aade779c3912f2e6c0646b5"} Mar 18 18:17:22 crc kubenswrapper[5008]: I0318 18:17:22.110324 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-dfm4p" podStartSLOduration=1.3327074030000001 podStartE2EDuration="4.110306117s" podCreationTimestamp="2026-03-18 18:17:18 +0000 UTC" firstStartedPulling="2026-03-18 18:17:18.440947003 +0000 UTC m=+894.960420082" lastFinishedPulling="2026-03-18 18:17:21.218545717 +0000 UTC m=+897.738018796" observedRunningTime="2026-03-18 18:17:22.11007468 +0000 UTC m=+898.629547759" watchObservedRunningTime="2026-03-18 18:17:22.110306117 +0000 UTC 
m=+898.629779196" Mar 18 18:17:22 crc kubenswrapper[5008]: I0318 18:17:22.134476 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5" podStartSLOduration=1.701069291 podStartE2EDuration="4.134415526s" podCreationTimestamp="2026-03-18 18:17:18 +0000 UTC" firstStartedPulling="2026-03-18 18:17:18.81633714 +0000 UTC m=+895.335810229" lastFinishedPulling="2026-03-18 18:17:21.249683385 +0000 UTC m=+897.769156464" observedRunningTime="2026-03-18 18:17:22.129807752 +0000 UTC m=+898.649280861" watchObservedRunningTime="2026-03-18 18:17:22.134415526 +0000 UTC m=+898.653888645" Mar 18 18:17:23 crc kubenswrapper[5008]: I0318 18:17:23.107468 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" event={"ID":"72afd8eb-d07a-4828-ab98-c094097c937d","Type":"ContainerStarted","Data":"71fc60f469af24345b5768e168ad529b8acede206ec3302bb19ce4408a27aa88"} Mar 18 18:17:23 crc kubenswrapper[5008]: I0318 18:17:23.132198 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-p95vq" podStartSLOduration=2.264754847 podStartE2EDuration="5.132177879s" podCreationTimestamp="2026-03-18 18:17:18 +0000 UTC" firstStartedPulling="2026-03-18 18:17:19.279797778 +0000 UTC m=+895.799270877" lastFinishedPulling="2026-03-18 18:17:22.14722083 +0000 UTC m=+898.666693909" observedRunningTime="2026-03-18 18:17:23.12701361 +0000 UTC m=+899.646486709" watchObservedRunningTime="2026-03-18 18:17:23.132177879 +0000 UTC m=+899.651650958" Mar 18 18:17:24 crc kubenswrapper[5008]: I0318 18:17:24.115486 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqxqv" event={"ID":"ed341cb5-f441-4f05-951b-973883b19672","Type":"ContainerStarted","Data":"c9804a05f1d4f1d47e7f7c965e9f09c0889ec1fe235e6619fe15ff6bcb8ce51f"} Mar 18 18:17:24 crc kubenswrapper[5008]: I0318 
18:17:24.139527 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-cqxqv" podStartSLOduration=1.190849103 podStartE2EDuration="6.13949905s" podCreationTimestamp="2026-03-18 18:17:18 +0000 UTC" firstStartedPulling="2026-03-18 18:17:18.638987905 +0000 UTC m=+895.158460984" lastFinishedPulling="2026-03-18 18:17:23.587637852 +0000 UTC m=+900.107110931" observedRunningTime="2026-03-18 18:17:24.13428323 +0000 UTC m=+900.653756339" watchObservedRunningTime="2026-03-18 18:17:24.13949905 +0000 UTC m=+900.658972139" Mar 18 18:17:28 crc kubenswrapper[5008]: I0318 18:17:28.430920 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-dfm4p" Mar 18 18:17:28 crc kubenswrapper[5008]: I0318 18:17:28.734058 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:28 crc kubenswrapper[5008]: I0318 18:17:28.734168 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:28 crc kubenswrapper[5008]: I0318 18:17:28.739601 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:29 crc kubenswrapper[5008]: I0318 18:17:29.150159 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9876dd4d8-vcjr8" Mar 18 18:17:29 crc kubenswrapper[5008]: I0318 18:17:29.248206 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-gmczr"] Mar 18 18:17:38 crc kubenswrapper[5008]: I0318 18:17:38.372174 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vnfv5" Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.345970 5008 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp"] Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.347693 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.349469 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.353825 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp"] Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.516419 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhzt7\" (UniqueName: \"kubernetes.io/projected/a9b926c7-726b-4b59-b7de-0939b357dbc2-kube-api-access-bhzt7\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp\" (UID: \"a9b926c7-726b-4b59-b7de-0939b357dbc2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.516473 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9b926c7-726b-4b59-b7de-0939b357dbc2-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp\" (UID: \"a9b926c7-726b-4b59-b7de-0939b357dbc2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.516506 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9b926c7-726b-4b59-b7de-0939b357dbc2-util\") pod 
\"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp\" (UID: \"a9b926c7-726b-4b59-b7de-0939b357dbc2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.617476 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhzt7\" (UniqueName: \"kubernetes.io/projected/a9b926c7-726b-4b59-b7de-0939b357dbc2-kube-api-access-bhzt7\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp\" (UID: \"a9b926c7-726b-4b59-b7de-0939b357dbc2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.617530 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9b926c7-726b-4b59-b7de-0939b357dbc2-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp\" (UID: \"a9b926c7-726b-4b59-b7de-0939b357dbc2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.617581 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9b926c7-726b-4b59-b7de-0939b357dbc2-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp\" (UID: \"a9b926c7-726b-4b59-b7de-0939b357dbc2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.618509 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9b926c7-726b-4b59-b7de-0939b357dbc2-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp\" (UID: \"a9b926c7-726b-4b59-b7de-0939b357dbc2\") " 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.618547 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9b926c7-726b-4b59-b7de-0939b357dbc2-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp\" (UID: \"a9b926c7-726b-4b59-b7de-0939b357dbc2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.655147 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhzt7\" (UniqueName: \"kubernetes.io/projected/a9b926c7-726b-4b59-b7de-0939b357dbc2-kube-api-access-bhzt7\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp\" (UID: \"a9b926c7-726b-4b59-b7de-0939b357dbc2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.676062 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" Mar 18 18:17:51 crc kubenswrapper[5008]: I0318 18:17:51.916371 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp"] Mar 18 18:17:51 crc kubenswrapper[5008]: W0318 18:17:51.924212 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9b926c7_726b_4b59_b7de_0939b357dbc2.slice/crio-15660675efb75f20a4a0890ba79a15a0462c8c6cfcbdeb11327486639d129d24 WatchSource:0}: Error finding container 15660675efb75f20a4a0890ba79a15a0462c8c6cfcbdeb11327486639d129d24: Status 404 returned error can't find the container with id 15660675efb75f20a4a0890ba79a15a0462c8c6cfcbdeb11327486639d129d24 Mar 18 18:17:52 crc kubenswrapper[5008]: I0318 18:17:52.337753 5008 generic.go:334] "Generic (PLEG): container finished" podID="a9b926c7-726b-4b59-b7de-0939b357dbc2" containerID="ea0c3cd92d56f7e14bfe5ab1020aa402f8810a00452de65ae0d01adb6969e555" exitCode=0 Mar 18 18:17:52 crc kubenswrapper[5008]: I0318 18:17:52.337893 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" event={"ID":"a9b926c7-726b-4b59-b7de-0939b357dbc2","Type":"ContainerDied","Data":"ea0c3cd92d56f7e14bfe5ab1020aa402f8810a00452de65ae0d01adb6969e555"} Mar 18 18:17:52 crc kubenswrapper[5008]: I0318 18:17:52.338165 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" event={"ID":"a9b926c7-726b-4b59-b7de-0939b357dbc2","Type":"ContainerStarted","Data":"15660675efb75f20a4a0890ba79a15a0462c8c6cfcbdeb11327486639d129d24"} Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.296678 5008 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-console/console-f9d7485db-gmczr" podUID="0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" containerName="console" containerID="cri-o://5ddcb73aa02c51fd03cf5e8730d944f6a3370747a550036feea01bdf26bf2138" gracePeriod=15 Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.353454 5008 generic.go:334] "Generic (PLEG): container finished" podID="a9b926c7-726b-4b59-b7de-0939b357dbc2" containerID="e2e1411557f81d0c7768d9a62bcedf31d3569878588196b1136e1148f4f719ed" exitCode=0 Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.353512 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" event={"ID":"a9b926c7-726b-4b59-b7de-0939b357dbc2","Type":"ContainerDied","Data":"e2e1411557f81d0c7768d9a62bcedf31d3569878588196b1136e1148f4f719ed"} Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.460931 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.461004 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.757586 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-gmczr_0dfb1aec-81e4-4b51-9a75-afb89d78a1fc/console/0.log" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.757997 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.861690 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vmbl\" (UniqueName: \"kubernetes.io/projected/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-kube-api-access-7vmbl\") pod \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.861812 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-config\") pod \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.861923 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-oauth-config\") pod \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.861957 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-oauth-serving-cert\") pod \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.862000 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-service-ca\") pod \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.862042 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-serving-cert\") pod \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.862123 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-trusted-ca-bundle\") pod \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\" (UID: \"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc\") " Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.863963 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-config" (OuterVolumeSpecName: "console-config") pod "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" (UID: "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.864049 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-service-ca" (OuterVolumeSpecName: "service-ca") pod "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" (UID: "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.864072 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" (UID: "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.864155 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" (UID: "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.869200 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" (UID: "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.869262 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-kube-api-access-7vmbl" (OuterVolumeSpecName: "kube-api-access-7vmbl") pod "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" (UID: "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc"). InnerVolumeSpecName "kube-api-access-7vmbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.869496 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" (UID: "0dfb1aec-81e4-4b51-9a75-afb89d78a1fc"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.964113 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vmbl\" (UniqueName: \"kubernetes.io/projected/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-kube-api-access-7vmbl\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.964148 5008 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.964157 5008 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.964165 5008 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.964174 5008 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.964181 5008 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:54 crc kubenswrapper[5008]: I0318 18:17:54.964194 5008 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:55 crc 
kubenswrapper[5008]: I0318 18:17:55.360220 5008 generic.go:334] "Generic (PLEG): container finished" podID="a9b926c7-726b-4b59-b7de-0939b357dbc2" containerID="8748416c0943bd9addb9b2793bd6bb2a2cf897260e351c244d04b9d0f743f4d7" exitCode=0 Mar 18 18:17:55 crc kubenswrapper[5008]: I0318 18:17:55.360286 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" event={"ID":"a9b926c7-726b-4b59-b7de-0939b357dbc2","Type":"ContainerDied","Data":"8748416c0943bd9addb9b2793bd6bb2a2cf897260e351c244d04b9d0f743f4d7"} Mar 18 18:17:55 crc kubenswrapper[5008]: I0318 18:17:55.362313 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-gmczr_0dfb1aec-81e4-4b51-9a75-afb89d78a1fc/console/0.log" Mar 18 18:17:55 crc kubenswrapper[5008]: I0318 18:17:55.362347 5008 generic.go:334] "Generic (PLEG): container finished" podID="0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" containerID="5ddcb73aa02c51fd03cf5e8730d944f6a3370747a550036feea01bdf26bf2138" exitCode=2 Mar 18 18:17:55 crc kubenswrapper[5008]: I0318 18:17:55.362367 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gmczr" event={"ID":"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc","Type":"ContainerDied","Data":"5ddcb73aa02c51fd03cf5e8730d944f6a3370747a550036feea01bdf26bf2138"} Mar 18 18:17:55 crc kubenswrapper[5008]: I0318 18:17:55.362384 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gmczr" event={"ID":"0dfb1aec-81e4-4b51-9a75-afb89d78a1fc","Type":"ContainerDied","Data":"b2970a32373ae1b6aa4b6a5006f622a892412da13640072a06f1375a4d9c80a5"} Mar 18 18:17:55 crc kubenswrapper[5008]: I0318 18:17:55.362399 5008 scope.go:117] "RemoveContainer" containerID="5ddcb73aa02c51fd03cf5e8730d944f6a3370747a550036feea01bdf26bf2138" Mar 18 18:17:55 crc kubenswrapper[5008]: I0318 18:17:55.362464 5008 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-gmczr" Mar 18 18:17:55 crc kubenswrapper[5008]: I0318 18:17:55.386743 5008 scope.go:117] "RemoveContainer" containerID="5ddcb73aa02c51fd03cf5e8730d944f6a3370747a550036feea01bdf26bf2138" Mar 18 18:17:55 crc kubenswrapper[5008]: E0318 18:17:55.387143 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ddcb73aa02c51fd03cf5e8730d944f6a3370747a550036feea01bdf26bf2138\": container with ID starting with 5ddcb73aa02c51fd03cf5e8730d944f6a3370747a550036feea01bdf26bf2138 not found: ID does not exist" containerID="5ddcb73aa02c51fd03cf5e8730d944f6a3370747a550036feea01bdf26bf2138" Mar 18 18:17:55 crc kubenswrapper[5008]: I0318 18:17:55.387255 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ddcb73aa02c51fd03cf5e8730d944f6a3370747a550036feea01bdf26bf2138"} err="failed to get container status \"5ddcb73aa02c51fd03cf5e8730d944f6a3370747a550036feea01bdf26bf2138\": rpc error: code = NotFound desc = could not find container \"5ddcb73aa02c51fd03cf5e8730d944f6a3370747a550036feea01bdf26bf2138\": container with ID starting with 5ddcb73aa02c51fd03cf5e8730d944f6a3370747a550036feea01bdf26bf2138 not found: ID does not exist" Mar 18 18:17:55 crc kubenswrapper[5008]: I0318 18:17:55.402351 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-gmczr"] Mar 18 18:17:55 crc kubenswrapper[5008]: I0318 18:17:55.406227 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-gmczr"] Mar 18 18:17:56 crc kubenswrapper[5008]: I0318 18:17:56.210735 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" path="/var/lib/kubelet/pods/0dfb1aec-81e4-4b51-9a75-afb89d78a1fc/volumes" Mar 18 18:17:56 crc kubenswrapper[5008]: I0318 18:17:56.680299 5008 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" Mar 18 18:17:56 crc kubenswrapper[5008]: I0318 18:17:56.787882 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9b926c7-726b-4b59-b7de-0939b357dbc2-util\") pod \"a9b926c7-726b-4b59-b7de-0939b357dbc2\" (UID: \"a9b926c7-726b-4b59-b7de-0939b357dbc2\") " Mar 18 18:17:56 crc kubenswrapper[5008]: I0318 18:17:56.788010 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhzt7\" (UniqueName: \"kubernetes.io/projected/a9b926c7-726b-4b59-b7de-0939b357dbc2-kube-api-access-bhzt7\") pod \"a9b926c7-726b-4b59-b7de-0939b357dbc2\" (UID: \"a9b926c7-726b-4b59-b7de-0939b357dbc2\") " Mar 18 18:17:56 crc kubenswrapper[5008]: I0318 18:17:56.788037 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9b926c7-726b-4b59-b7de-0939b357dbc2-bundle\") pod \"a9b926c7-726b-4b59-b7de-0939b357dbc2\" (UID: \"a9b926c7-726b-4b59-b7de-0939b357dbc2\") " Mar 18 18:17:56 crc kubenswrapper[5008]: I0318 18:17:56.789265 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9b926c7-726b-4b59-b7de-0939b357dbc2-bundle" (OuterVolumeSpecName: "bundle") pod "a9b926c7-726b-4b59-b7de-0939b357dbc2" (UID: "a9b926c7-726b-4b59-b7de-0939b357dbc2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:17:56 crc kubenswrapper[5008]: I0318 18:17:56.792719 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b926c7-726b-4b59-b7de-0939b357dbc2-kube-api-access-bhzt7" (OuterVolumeSpecName: "kube-api-access-bhzt7") pod "a9b926c7-726b-4b59-b7de-0939b357dbc2" (UID: "a9b926c7-726b-4b59-b7de-0939b357dbc2"). 
InnerVolumeSpecName "kube-api-access-bhzt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:17:56 crc kubenswrapper[5008]: I0318 18:17:56.803312 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9b926c7-726b-4b59-b7de-0939b357dbc2-util" (OuterVolumeSpecName: "util") pod "a9b926c7-726b-4b59-b7de-0939b357dbc2" (UID: "a9b926c7-726b-4b59-b7de-0939b357dbc2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:17:56 crc kubenswrapper[5008]: I0318 18:17:56.889081 5008 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a9b926c7-726b-4b59-b7de-0939b357dbc2-util\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:56 crc kubenswrapper[5008]: I0318 18:17:56.889114 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhzt7\" (UniqueName: \"kubernetes.io/projected/a9b926c7-726b-4b59-b7de-0939b357dbc2-kube-api-access-bhzt7\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:56 crc kubenswrapper[5008]: I0318 18:17:56.889128 5008 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a9b926c7-726b-4b59-b7de-0939b357dbc2-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:17:57 crc kubenswrapper[5008]: I0318 18:17:57.387884 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" event={"ID":"a9b926c7-726b-4b59-b7de-0939b357dbc2","Type":"ContainerDied","Data":"15660675efb75f20a4a0890ba79a15a0462c8c6cfcbdeb11327486639d129d24"} Mar 18 18:17:57 crc kubenswrapper[5008]: I0318 18:17:57.388299 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15660675efb75f20a4a0890ba79a15a0462c8c6cfcbdeb11327486639d129d24" Mar 18 18:17:57 crc kubenswrapper[5008]: I0318 18:17:57.387944 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.184284 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564298-gcwk6"] Mar 18 18:18:00 crc kubenswrapper[5008]: E0318 18:18:00.184489 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" containerName="console" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.184500 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" containerName="console" Mar 18 18:18:00 crc kubenswrapper[5008]: E0318 18:18:00.184516 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b926c7-726b-4b59-b7de-0939b357dbc2" containerName="util" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.184522 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b926c7-726b-4b59-b7de-0939b357dbc2" containerName="util" Mar 18 18:18:00 crc kubenswrapper[5008]: E0318 18:18:00.184533 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b926c7-726b-4b59-b7de-0939b357dbc2" containerName="extract" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.184539 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b926c7-726b-4b59-b7de-0939b357dbc2" containerName="extract" Mar 18 18:18:00 crc kubenswrapper[5008]: E0318 18:18:00.184546 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b926c7-726b-4b59-b7de-0939b357dbc2" containerName="pull" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.184568 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b926c7-726b-4b59-b7de-0939b357dbc2" containerName="pull" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.184663 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b926c7-726b-4b59-b7de-0939b357dbc2" containerName="extract" Mar 
18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.184676 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dfb1aec-81e4-4b51-9a75-afb89d78a1fc" containerName="console" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.185036 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564298-gcwk6" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.188376 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.188579 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.189061 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.206043 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564298-gcwk6"] Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.329228 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h9nm\" (UniqueName: \"kubernetes.io/projected/14e9e042-217a-40fe-a7c7-1d63a37cf3de-kube-api-access-4h9nm\") pod \"auto-csr-approver-29564298-gcwk6\" (UID: \"14e9e042-217a-40fe-a7c7-1d63a37cf3de\") " pod="openshift-infra/auto-csr-approver-29564298-gcwk6" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.430736 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h9nm\" (UniqueName: \"kubernetes.io/projected/14e9e042-217a-40fe-a7c7-1d63a37cf3de-kube-api-access-4h9nm\") pod \"auto-csr-approver-29564298-gcwk6\" (UID: \"14e9e042-217a-40fe-a7c7-1d63a37cf3de\") " pod="openshift-infra/auto-csr-approver-29564298-gcwk6" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 
18:18:00.447830 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h9nm\" (UniqueName: \"kubernetes.io/projected/14e9e042-217a-40fe-a7c7-1d63a37cf3de-kube-api-access-4h9nm\") pod \"auto-csr-approver-29564298-gcwk6\" (UID: \"14e9e042-217a-40fe-a7c7-1d63a37cf3de\") " pod="openshift-infra/auto-csr-approver-29564298-gcwk6" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.498522 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564298-gcwk6" Mar 18 18:18:00 crc kubenswrapper[5008]: I0318 18:18:00.694299 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564298-gcwk6"] Mar 18 18:18:01 crc kubenswrapper[5008]: I0318 18:18:01.408108 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564298-gcwk6" event={"ID":"14e9e042-217a-40fe-a7c7-1d63a37cf3de","Type":"ContainerStarted","Data":"f420ed08619e562d0243612221c7280ed5b9855497357fc6e253771c19eae3ba"} Mar 18 18:18:02 crc kubenswrapper[5008]: I0318 18:18:02.413827 5008 generic.go:334] "Generic (PLEG): container finished" podID="14e9e042-217a-40fe-a7c7-1d63a37cf3de" containerID="0f52ebe4b101a8a1011a40ad89e5922e116ee42a53b3bcfa84275658fd556111" exitCode=0 Mar 18 18:18:02 crc kubenswrapper[5008]: I0318 18:18:02.413895 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564298-gcwk6" event={"ID":"14e9e042-217a-40fe-a7c7-1d63a37cf3de","Type":"ContainerDied","Data":"0f52ebe4b101a8a1011a40ad89e5922e116ee42a53b3bcfa84275658fd556111"} Mar 18 18:18:03 crc kubenswrapper[5008]: I0318 18:18:03.688967 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564298-gcwk6" Mar 18 18:18:03 crc kubenswrapper[5008]: I0318 18:18:03.773144 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h9nm\" (UniqueName: \"kubernetes.io/projected/14e9e042-217a-40fe-a7c7-1d63a37cf3de-kube-api-access-4h9nm\") pod \"14e9e042-217a-40fe-a7c7-1d63a37cf3de\" (UID: \"14e9e042-217a-40fe-a7c7-1d63a37cf3de\") " Mar 18 18:18:03 crc kubenswrapper[5008]: I0318 18:18:03.792164 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e9e042-217a-40fe-a7c7-1d63a37cf3de-kube-api-access-4h9nm" (OuterVolumeSpecName: "kube-api-access-4h9nm") pod "14e9e042-217a-40fe-a7c7-1d63a37cf3de" (UID: "14e9e042-217a-40fe-a7c7-1d63a37cf3de"). InnerVolumeSpecName "kube-api-access-4h9nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:18:03 crc kubenswrapper[5008]: I0318 18:18:03.874923 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h9nm\" (UniqueName: \"kubernetes.io/projected/14e9e042-217a-40fe-a7c7-1d63a37cf3de-kube-api-access-4h9nm\") on node \"crc\" DevicePath \"\"" Mar 18 18:18:04 crc kubenswrapper[5008]: I0318 18:18:04.427308 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564298-gcwk6" Mar 18 18:18:04 crc kubenswrapper[5008]: I0318 18:18:04.427307 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564298-gcwk6" event={"ID":"14e9e042-217a-40fe-a7c7-1d63a37cf3de","Type":"ContainerDied","Data":"f420ed08619e562d0243612221c7280ed5b9855497357fc6e253771c19eae3ba"} Mar 18 18:18:04 crc kubenswrapper[5008]: I0318 18:18:04.427773 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f420ed08619e562d0243612221c7280ed5b9855497357fc6e253771c19eae3ba" Mar 18 18:18:04 crc kubenswrapper[5008]: I0318 18:18:04.759376 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564292-hwhk6"] Mar 18 18:18:04 crc kubenswrapper[5008]: I0318 18:18:04.764117 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564292-hwhk6"] Mar 18 18:18:06 crc kubenswrapper[5008]: I0318 18:18:06.212478 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a02e271-d56f-49b0-98bf-18b9ac62f364" path="/var/lib/kubelet/pods/8a02e271-d56f-49b0-98bf-18b9ac62f364/volumes" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.051679 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv"] Mar 18 18:18:08 crc kubenswrapper[5008]: E0318 18:18:08.052475 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14e9e042-217a-40fe-a7c7-1d63a37cf3de" containerName="oc" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.052493 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e9e042-217a-40fe-a7c7-1d63a37cf3de" containerName="oc" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.052626 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="14e9e042-217a-40fe-a7c7-1d63a37cf3de" containerName="oc" Mar 18 18:18:08 crc 
kubenswrapper[5008]: I0318 18:18:08.053863 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.055861 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.055974 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-rj22w" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.056815 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.059466 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.062653 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.131954 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv"] Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.132192 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd83a3e9-013e-469d-9f55-fcc1197ff19c-apiservice-cert\") pod \"metallb-operator-controller-manager-84bb8d8d7b-wj6kv\" (UID: \"fd83a3e9-013e-469d-9f55-fcc1197ff19c\") " pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.132259 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp5xt\" (UniqueName: 
\"kubernetes.io/projected/fd83a3e9-013e-469d-9f55-fcc1197ff19c-kube-api-access-wp5xt\") pod \"metallb-operator-controller-manager-84bb8d8d7b-wj6kv\" (UID: \"fd83a3e9-013e-469d-9f55-fcc1197ff19c\") " pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.132361 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd83a3e9-013e-469d-9f55-fcc1197ff19c-webhook-cert\") pod \"metallb-operator-controller-manager-84bb8d8d7b-wj6kv\" (UID: \"fd83a3e9-013e-469d-9f55-fcc1197ff19c\") " pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.233765 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd83a3e9-013e-469d-9f55-fcc1197ff19c-webhook-cert\") pod \"metallb-operator-controller-manager-84bb8d8d7b-wj6kv\" (UID: \"fd83a3e9-013e-469d-9f55-fcc1197ff19c\") " pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.233819 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd83a3e9-013e-469d-9f55-fcc1197ff19c-apiservice-cert\") pod \"metallb-operator-controller-manager-84bb8d8d7b-wj6kv\" (UID: \"fd83a3e9-013e-469d-9f55-fcc1197ff19c\") " pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.233856 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp5xt\" (UniqueName: \"kubernetes.io/projected/fd83a3e9-013e-469d-9f55-fcc1197ff19c-kube-api-access-wp5xt\") pod \"metallb-operator-controller-manager-84bb8d8d7b-wj6kv\" (UID: \"fd83a3e9-013e-469d-9f55-fcc1197ff19c\") " 
pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.241205 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd83a3e9-013e-469d-9f55-fcc1197ff19c-webhook-cert\") pod \"metallb-operator-controller-manager-84bb8d8d7b-wj6kv\" (UID: \"fd83a3e9-013e-469d-9f55-fcc1197ff19c\") " pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.241269 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd83a3e9-013e-469d-9f55-fcc1197ff19c-apiservice-cert\") pod \"metallb-operator-controller-manager-84bb8d8d7b-wj6kv\" (UID: \"fd83a3e9-013e-469d-9f55-fcc1197ff19c\") " pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.281858 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp5xt\" (UniqueName: \"kubernetes.io/projected/fd83a3e9-013e-469d-9f55-fcc1197ff19c-kube-api-access-wp5xt\") pod \"metallb-operator-controller-manager-84bb8d8d7b-wj6kv\" (UID: \"fd83a3e9-013e-469d-9f55-fcc1197ff19c\") " pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.370301 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.391261 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-66dd9db899-r984g"] Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.392109 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.395532 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.395591 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-djpv6" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.403576 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66dd9db899-r984g"] Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.411122 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.536143 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8257155e-9c34-4066-99b6-9c112804ee23-apiservice-cert\") pod \"metallb-operator-webhook-server-66dd9db899-r984g\" (UID: \"8257155e-9c34-4066-99b6-9c112804ee23\") " pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.536205 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8257155e-9c34-4066-99b6-9c112804ee23-webhook-cert\") pod \"metallb-operator-webhook-server-66dd9db899-r984g\" (UID: \"8257155e-9c34-4066-99b6-9c112804ee23\") " pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.536230 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l6pv\" (UniqueName: 
\"kubernetes.io/projected/8257155e-9c34-4066-99b6-9c112804ee23-kube-api-access-7l6pv\") pod \"metallb-operator-webhook-server-66dd9db899-r984g\" (UID: \"8257155e-9c34-4066-99b6-9c112804ee23\") " pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.599513 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv"] Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.637791 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8257155e-9c34-4066-99b6-9c112804ee23-apiservice-cert\") pod \"metallb-operator-webhook-server-66dd9db899-r984g\" (UID: \"8257155e-9c34-4066-99b6-9c112804ee23\") " pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.637881 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8257155e-9c34-4066-99b6-9c112804ee23-webhook-cert\") pod \"metallb-operator-webhook-server-66dd9db899-r984g\" (UID: \"8257155e-9c34-4066-99b6-9c112804ee23\") " pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.637922 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l6pv\" (UniqueName: \"kubernetes.io/projected/8257155e-9c34-4066-99b6-9c112804ee23-kube-api-access-7l6pv\") pod \"metallb-operator-webhook-server-66dd9db899-r984g\" (UID: \"8257155e-9c34-4066-99b6-9c112804ee23\") " pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.642540 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/8257155e-9c34-4066-99b6-9c112804ee23-apiservice-cert\") pod \"metallb-operator-webhook-server-66dd9db899-r984g\" (UID: \"8257155e-9c34-4066-99b6-9c112804ee23\") " pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.644256 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8257155e-9c34-4066-99b6-9c112804ee23-webhook-cert\") pod \"metallb-operator-webhook-server-66dd9db899-r984g\" (UID: \"8257155e-9c34-4066-99b6-9c112804ee23\") " pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.654441 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l6pv\" (UniqueName: \"kubernetes.io/projected/8257155e-9c34-4066-99b6-9c112804ee23-kube-api-access-7l6pv\") pod \"metallb-operator-webhook-server-66dd9db899-r984g\" (UID: \"8257155e-9c34-4066-99b6-9c112804ee23\") " pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.744419 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" Mar 18 18:18:08 crc kubenswrapper[5008]: I0318 18:18:08.928510 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66dd9db899-r984g"] Mar 18 18:18:08 crc kubenswrapper[5008]: W0318 18:18:08.943890 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8257155e_9c34_4066_99b6_9c112804ee23.slice/crio-ba4f863f434b308237402fd58c174b2b46241d9c8e0946a65b01a59b748161a9 WatchSource:0}: Error finding container ba4f863f434b308237402fd58c174b2b46241d9c8e0946a65b01a59b748161a9: Status 404 returned error can't find the container with id ba4f863f434b308237402fd58c174b2b46241d9c8e0946a65b01a59b748161a9 Mar 18 18:18:09 crc kubenswrapper[5008]: I0318 18:18:09.460738 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" event={"ID":"8257155e-9c34-4066-99b6-9c112804ee23","Type":"ContainerStarted","Data":"ba4f863f434b308237402fd58c174b2b46241d9c8e0946a65b01a59b748161a9"} Mar 18 18:18:09 crc kubenswrapper[5008]: I0318 18:18:09.462760 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" event={"ID":"fd83a3e9-013e-469d-9f55-fcc1197ff19c","Type":"ContainerStarted","Data":"079c1af858bf4d6299b7b96d711b26c8133b3f3bbddedc21af08e1c67805fe6f"} Mar 18 18:18:14 crc kubenswrapper[5008]: I0318 18:18:14.490723 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" event={"ID":"8257155e-9c34-4066-99b6-9c112804ee23","Type":"ContainerStarted","Data":"00ddf7b47c0e56ab8f2bc629a94fc39670fb6277e4f287a5fd0ff07ed21e6169"} Mar 18 18:18:14 crc kubenswrapper[5008]: I0318 18:18:14.491432 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" Mar 18 18:18:14 crc kubenswrapper[5008]: I0318 18:18:14.492347 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" event={"ID":"fd83a3e9-013e-469d-9f55-fcc1197ff19c","Type":"ContainerStarted","Data":"90d51da2b63542495e660b876fc77010082e6533881e21d800d72e04cb3b275f"} Mar 18 18:18:14 crc kubenswrapper[5008]: I0318 18:18:14.492653 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" Mar 18 18:18:14 crc kubenswrapper[5008]: I0318 18:18:14.510246 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" podStartSLOduration=1.46861859 podStartE2EDuration="6.510224596s" podCreationTimestamp="2026-03-18 18:18:08 +0000 UTC" firstStartedPulling="2026-03-18 18:18:08.947816663 +0000 UTC m=+945.467289742" lastFinishedPulling="2026-03-18 18:18:13.989422669 +0000 UTC m=+950.508895748" observedRunningTime="2026-03-18 18:18:14.510107353 +0000 UTC m=+951.029580432" watchObservedRunningTime="2026-03-18 18:18:14.510224596 +0000 UTC m=+951.029697715" Mar 18 18:18:14 crc kubenswrapper[5008]: I0318 18:18:14.535447 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" podStartSLOduration=1.169834679 podStartE2EDuration="6.535424727s" podCreationTimestamp="2026-03-18 18:18:08 +0000 UTC" firstStartedPulling="2026-03-18 18:18:08.610180877 +0000 UTC m=+945.129653946" lastFinishedPulling="2026-03-18 18:18:13.975770915 +0000 UTC m=+950.495243994" observedRunningTime="2026-03-18 18:18:14.531516963 +0000 UTC m=+951.050990052" watchObservedRunningTime="2026-03-18 18:18:14.535424727 +0000 UTC m=+951.054897806" Mar 18 18:18:22 crc kubenswrapper[5008]: I0318 18:18:22.732799 5008 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r5qrd"] Mar 18 18:18:22 crc kubenswrapper[5008]: I0318 18:18:22.734926 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:22 crc kubenswrapper[5008]: I0318 18:18:22.752518 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5qrd"] Mar 18 18:18:22 crc kubenswrapper[5008]: I0318 18:18:22.827466 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc9g7\" (UniqueName: \"kubernetes.io/projected/cd569db3-34b9-4c51-8188-d6f8d4e5056b-kube-api-access-jc9g7\") pod \"redhat-marketplace-r5qrd\" (UID: \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\") " pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:22 crc kubenswrapper[5008]: I0318 18:18:22.827625 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd569db3-34b9-4c51-8188-d6f8d4e5056b-catalog-content\") pod \"redhat-marketplace-r5qrd\" (UID: \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\") " pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:22 crc kubenswrapper[5008]: I0318 18:18:22.827783 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd569db3-34b9-4c51-8188-d6f8d4e5056b-utilities\") pod \"redhat-marketplace-r5qrd\" (UID: \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\") " pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:22 crc kubenswrapper[5008]: I0318 18:18:22.929818 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd569db3-34b9-4c51-8188-d6f8d4e5056b-catalog-content\") pod \"redhat-marketplace-r5qrd\" 
(UID: \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\") " pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:22 crc kubenswrapper[5008]: I0318 18:18:22.929925 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd569db3-34b9-4c51-8188-d6f8d4e5056b-utilities\") pod \"redhat-marketplace-r5qrd\" (UID: \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\") " pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:22 crc kubenswrapper[5008]: I0318 18:18:22.929957 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc9g7\" (UniqueName: \"kubernetes.io/projected/cd569db3-34b9-4c51-8188-d6f8d4e5056b-kube-api-access-jc9g7\") pod \"redhat-marketplace-r5qrd\" (UID: \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\") " pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:22 crc kubenswrapper[5008]: I0318 18:18:22.930679 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd569db3-34b9-4c51-8188-d6f8d4e5056b-catalog-content\") pod \"redhat-marketplace-r5qrd\" (UID: \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\") " pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:22 crc kubenswrapper[5008]: I0318 18:18:22.930731 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd569db3-34b9-4c51-8188-d6f8d4e5056b-utilities\") pod \"redhat-marketplace-r5qrd\" (UID: \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\") " pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:22 crc kubenswrapper[5008]: I0318 18:18:22.958396 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc9g7\" (UniqueName: \"kubernetes.io/projected/cd569db3-34b9-4c51-8188-d6f8d4e5056b-kube-api-access-jc9g7\") pod \"redhat-marketplace-r5qrd\" (UID: \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\") 
" pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:23 crc kubenswrapper[5008]: I0318 18:18:23.064848 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:23 crc kubenswrapper[5008]: I0318 18:18:23.285006 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5qrd"] Mar 18 18:18:23 crc kubenswrapper[5008]: I0318 18:18:23.553855 5008 generic.go:334] "Generic (PLEG): container finished" podID="cd569db3-34b9-4c51-8188-d6f8d4e5056b" containerID="6067e97e799667147b463f645d59d4c8bf66315f2481125be66358b7ac1c6b59" exitCode=0 Mar 18 18:18:23 crc kubenswrapper[5008]: I0318 18:18:23.553909 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qrd" event={"ID":"cd569db3-34b9-4c51-8188-d6f8d4e5056b","Type":"ContainerDied","Data":"6067e97e799667147b463f645d59d4c8bf66315f2481125be66358b7ac1c6b59"} Mar 18 18:18:23 crc kubenswrapper[5008]: I0318 18:18:23.554269 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qrd" event={"ID":"cd569db3-34b9-4c51-8188-d6f8d4e5056b","Type":"ContainerStarted","Data":"7f63bcabb4b73274150e866253e1d40cdb1dc3f0d13a6131e9a58d9b3a4a0ab6"} Mar 18 18:18:24 crc kubenswrapper[5008]: I0318 18:18:24.460957 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:18:24 crc kubenswrapper[5008]: I0318 18:18:24.461452 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:18:24 crc kubenswrapper[5008]: I0318 18:18:24.563034 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qrd" event={"ID":"cd569db3-34b9-4c51-8188-d6f8d4e5056b","Type":"ContainerStarted","Data":"09bc75df3f127e640b62d70b41f8ba6c1d689abd0bf4d07de296487a8570b19c"} Mar 18 18:18:25 crc kubenswrapper[5008]: I0318 18:18:25.572905 5008 generic.go:334] "Generic (PLEG): container finished" podID="cd569db3-34b9-4c51-8188-d6f8d4e5056b" containerID="09bc75df3f127e640b62d70b41f8ba6c1d689abd0bf4d07de296487a8570b19c" exitCode=0 Mar 18 18:18:25 crc kubenswrapper[5008]: I0318 18:18:25.573062 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qrd" event={"ID":"cd569db3-34b9-4c51-8188-d6f8d4e5056b","Type":"ContainerDied","Data":"09bc75df3f127e640b62d70b41f8ba6c1d689abd0bf4d07de296487a8570b19c"} Mar 18 18:18:27 crc kubenswrapper[5008]: I0318 18:18:27.588470 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qrd" event={"ID":"cd569db3-34b9-4c51-8188-d6f8d4e5056b","Type":"ContainerStarted","Data":"95b6b5bc125fca99df71eaf4e6f1c609103dfad0dbcc5dd72325be908aad2321"} Mar 18 18:18:27 crc kubenswrapper[5008]: I0318 18:18:27.607893 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r5qrd" podStartSLOduration=1.735278203 podStartE2EDuration="5.60787738s" podCreationTimestamp="2026-03-18 18:18:22 +0000 UTC" firstStartedPulling="2026-03-18 18:18:23.555534354 +0000 UTC m=+960.075007443" lastFinishedPulling="2026-03-18 18:18:27.428133541 +0000 UTC m=+963.947606620" observedRunningTime="2026-03-18 18:18:27.606113173 +0000 UTC m=+964.125586252" watchObservedRunningTime="2026-03-18 18:18:27.60787738 +0000 UTC m=+964.127350459" Mar 18 18:18:28 crc kubenswrapper[5008]: I0318 
18:18:28.750516 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-66dd9db899-r984g" Mar 18 18:18:33 crc kubenswrapper[5008]: I0318 18:18:33.065411 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:33 crc kubenswrapper[5008]: I0318 18:18:33.066222 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:33 crc kubenswrapper[5008]: I0318 18:18:33.133490 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:33 crc kubenswrapper[5008]: I0318 18:18:33.665636 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:35 crc kubenswrapper[5008]: I0318 18:18:35.523618 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5qrd"] Mar 18 18:18:35 crc kubenswrapper[5008]: I0318 18:18:35.641439 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r5qrd" podUID="cd569db3-34b9-4c51-8188-d6f8d4e5056b" containerName="registry-server" containerID="cri-o://95b6b5bc125fca99df71eaf4e6f1c609103dfad0dbcc5dd72325be908aad2321" gracePeriod=2 Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.099602 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.301499 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd569db3-34b9-4c51-8188-d6f8d4e5056b-utilities\") pod \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\" (UID: \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\") " Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.301583 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc9g7\" (UniqueName: \"kubernetes.io/projected/cd569db3-34b9-4c51-8188-d6f8d4e5056b-kube-api-access-jc9g7\") pod \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\" (UID: \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\") " Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.301650 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd569db3-34b9-4c51-8188-d6f8d4e5056b-catalog-content\") pod \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\" (UID: \"cd569db3-34b9-4c51-8188-d6f8d4e5056b\") " Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.302973 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd569db3-34b9-4c51-8188-d6f8d4e5056b-utilities" (OuterVolumeSpecName: "utilities") pod "cd569db3-34b9-4c51-8188-d6f8d4e5056b" (UID: "cd569db3-34b9-4c51-8188-d6f8d4e5056b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.316804 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd569db3-34b9-4c51-8188-d6f8d4e5056b-kube-api-access-jc9g7" (OuterVolumeSpecName: "kube-api-access-jc9g7") pod "cd569db3-34b9-4c51-8188-d6f8d4e5056b" (UID: "cd569db3-34b9-4c51-8188-d6f8d4e5056b"). InnerVolumeSpecName "kube-api-access-jc9g7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.352199 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd569db3-34b9-4c51-8188-d6f8d4e5056b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd569db3-34b9-4c51-8188-d6f8d4e5056b" (UID: "cd569db3-34b9-4c51-8188-d6f8d4e5056b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.403082 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd569db3-34b9-4c51-8188-d6f8d4e5056b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.403116 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc9g7\" (UniqueName: \"kubernetes.io/projected/cd569db3-34b9-4c51-8188-d6f8d4e5056b-kube-api-access-jc9g7\") on node \"crc\" DevicePath \"\"" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.403128 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd569db3-34b9-4c51-8188-d6f8d4e5056b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.494483 5008 scope.go:117] "RemoveContainer" containerID="68d1c1a92676bebaf4b365e611b18c4eb02dea01c14193a91b94bf2cb447f2ae" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.648702 5008 generic.go:334] "Generic (PLEG): container finished" podID="cd569db3-34b9-4c51-8188-d6f8d4e5056b" containerID="95b6b5bc125fca99df71eaf4e6f1c609103dfad0dbcc5dd72325be908aad2321" exitCode=0 Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.648767 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qrd" 
event={"ID":"cd569db3-34b9-4c51-8188-d6f8d4e5056b","Type":"ContainerDied","Data":"95b6b5bc125fca99df71eaf4e6f1c609103dfad0dbcc5dd72325be908aad2321"} Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.648780 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5qrd" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.648818 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qrd" event={"ID":"cd569db3-34b9-4c51-8188-d6f8d4e5056b","Type":"ContainerDied","Data":"7f63bcabb4b73274150e866253e1d40cdb1dc3f0d13a6131e9a58d9b3a4a0ab6"} Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.648850 5008 scope.go:117] "RemoveContainer" containerID="95b6b5bc125fca99df71eaf4e6f1c609103dfad0dbcc5dd72325be908aad2321" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.672390 5008 scope.go:117] "RemoveContainer" containerID="09bc75df3f127e640b62d70b41f8ba6c1d689abd0bf4d07de296487a8570b19c" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.693702 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5qrd"] Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.695372 5008 scope.go:117] "RemoveContainer" containerID="6067e97e799667147b463f645d59d4c8bf66315f2481125be66358b7ac1c6b59" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.701377 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5qrd"] Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.711815 5008 scope.go:117] "RemoveContainer" containerID="95b6b5bc125fca99df71eaf4e6f1c609103dfad0dbcc5dd72325be908aad2321" Mar 18 18:18:36 crc kubenswrapper[5008]: E0318 18:18:36.712186 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b6b5bc125fca99df71eaf4e6f1c609103dfad0dbcc5dd72325be908aad2321\": container 
with ID starting with 95b6b5bc125fca99df71eaf4e6f1c609103dfad0dbcc5dd72325be908aad2321 not found: ID does not exist" containerID="95b6b5bc125fca99df71eaf4e6f1c609103dfad0dbcc5dd72325be908aad2321" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.712237 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b6b5bc125fca99df71eaf4e6f1c609103dfad0dbcc5dd72325be908aad2321"} err="failed to get container status \"95b6b5bc125fca99df71eaf4e6f1c609103dfad0dbcc5dd72325be908aad2321\": rpc error: code = NotFound desc = could not find container \"95b6b5bc125fca99df71eaf4e6f1c609103dfad0dbcc5dd72325be908aad2321\": container with ID starting with 95b6b5bc125fca99df71eaf4e6f1c609103dfad0dbcc5dd72325be908aad2321 not found: ID does not exist" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.712271 5008 scope.go:117] "RemoveContainer" containerID="09bc75df3f127e640b62d70b41f8ba6c1d689abd0bf4d07de296487a8570b19c" Mar 18 18:18:36 crc kubenswrapper[5008]: E0318 18:18:36.712544 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09bc75df3f127e640b62d70b41f8ba6c1d689abd0bf4d07de296487a8570b19c\": container with ID starting with 09bc75df3f127e640b62d70b41f8ba6c1d689abd0bf4d07de296487a8570b19c not found: ID does not exist" containerID="09bc75df3f127e640b62d70b41f8ba6c1d689abd0bf4d07de296487a8570b19c" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.712579 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09bc75df3f127e640b62d70b41f8ba6c1d689abd0bf4d07de296487a8570b19c"} err="failed to get container status \"09bc75df3f127e640b62d70b41f8ba6c1d689abd0bf4d07de296487a8570b19c\": rpc error: code = NotFound desc = could not find container \"09bc75df3f127e640b62d70b41f8ba6c1d689abd0bf4d07de296487a8570b19c\": container with ID starting with 09bc75df3f127e640b62d70b41f8ba6c1d689abd0bf4d07de296487a8570b19c not 
found: ID does not exist" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.712620 5008 scope.go:117] "RemoveContainer" containerID="6067e97e799667147b463f645d59d4c8bf66315f2481125be66358b7ac1c6b59" Mar 18 18:18:36 crc kubenswrapper[5008]: E0318 18:18:36.712900 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6067e97e799667147b463f645d59d4c8bf66315f2481125be66358b7ac1c6b59\": container with ID starting with 6067e97e799667147b463f645d59d4c8bf66315f2481125be66358b7ac1c6b59 not found: ID does not exist" containerID="6067e97e799667147b463f645d59d4c8bf66315f2481125be66358b7ac1c6b59" Mar 18 18:18:36 crc kubenswrapper[5008]: I0318 18:18:36.712942 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6067e97e799667147b463f645d59d4c8bf66315f2481125be66358b7ac1c6b59"} err="failed to get container status \"6067e97e799667147b463f645d59d4c8bf66315f2481125be66358b7ac1c6b59\": rpc error: code = NotFound desc = could not find container \"6067e97e799667147b463f645d59d4c8bf66315f2481125be66358b7ac1c6b59\": container with ID starting with 6067e97e799667147b463f645d59d4c8bf66315f2481125be66358b7ac1c6b59 not found: ID does not exist" Mar 18 18:18:38 crc kubenswrapper[5008]: I0318 18:18:38.206789 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd569db3-34b9-4c51-8188-d6f8d4e5056b" path="/var/lib/kubelet/pods/cd569db3-34b9-4c51-8188-d6f8d4e5056b/volumes" Mar 18 18:18:40 crc kubenswrapper[5008]: I0318 18:18:40.943632 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-td56z"] Mar 18 18:18:40 crc kubenswrapper[5008]: E0318 18:18:40.944042 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd569db3-34b9-4c51-8188-d6f8d4e5056b" containerName="registry-server" Mar 18 18:18:40 crc kubenswrapper[5008]: I0318 18:18:40.944068 5008 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cd569db3-34b9-4c51-8188-d6f8d4e5056b" containerName="registry-server" Mar 18 18:18:40 crc kubenswrapper[5008]: E0318 18:18:40.944099 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd569db3-34b9-4c51-8188-d6f8d4e5056b" containerName="extract-utilities" Mar 18 18:18:40 crc kubenswrapper[5008]: I0318 18:18:40.944114 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd569db3-34b9-4c51-8188-d6f8d4e5056b" containerName="extract-utilities" Mar 18 18:18:40 crc kubenswrapper[5008]: E0318 18:18:40.944146 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd569db3-34b9-4c51-8188-d6f8d4e5056b" containerName="extract-content" Mar 18 18:18:40 crc kubenswrapper[5008]: I0318 18:18:40.944164 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd569db3-34b9-4c51-8188-d6f8d4e5056b" containerName="extract-content" Mar 18 18:18:40 crc kubenswrapper[5008]: I0318 18:18:40.944402 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd569db3-34b9-4c51-8188-d6f8d4e5056b" containerName="registry-server" Mar 18 18:18:40 crc kubenswrapper[5008]: I0318 18:18:40.946475 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:40 crc kubenswrapper[5008]: I0318 18:18:40.957364 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-td56z"] Mar 18 18:18:40 crc kubenswrapper[5008]: I0318 18:18:40.962547 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13f63bef-fe93-4f7e-80f1-ebf4cc703378-catalog-content\") pod \"certified-operators-td56z\" (UID: \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\") " pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:40 crc kubenswrapper[5008]: I0318 18:18:40.963663 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fkps\" (UniqueName: \"kubernetes.io/projected/13f63bef-fe93-4f7e-80f1-ebf4cc703378-kube-api-access-5fkps\") pod \"certified-operators-td56z\" (UID: \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\") " pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:40 crc kubenswrapper[5008]: I0318 18:18:40.963797 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13f63bef-fe93-4f7e-80f1-ebf4cc703378-utilities\") pod \"certified-operators-td56z\" (UID: \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\") " pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:41 crc kubenswrapper[5008]: I0318 18:18:41.065716 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13f63bef-fe93-4f7e-80f1-ebf4cc703378-catalog-content\") pod \"certified-operators-td56z\" (UID: \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\") " pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:41 crc kubenswrapper[5008]: I0318 18:18:41.065830 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5fkps\" (UniqueName: \"kubernetes.io/projected/13f63bef-fe93-4f7e-80f1-ebf4cc703378-kube-api-access-5fkps\") pod \"certified-operators-td56z\" (UID: \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\") " pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:41 crc kubenswrapper[5008]: I0318 18:18:41.065898 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13f63bef-fe93-4f7e-80f1-ebf4cc703378-utilities\") pod \"certified-operators-td56z\" (UID: \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\") " pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:41 crc kubenswrapper[5008]: I0318 18:18:41.066210 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13f63bef-fe93-4f7e-80f1-ebf4cc703378-catalog-content\") pod \"certified-operators-td56z\" (UID: \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\") " pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:41 crc kubenswrapper[5008]: I0318 18:18:41.066682 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13f63bef-fe93-4f7e-80f1-ebf4cc703378-utilities\") pod \"certified-operators-td56z\" (UID: \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\") " pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:41 crc kubenswrapper[5008]: I0318 18:18:41.095946 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fkps\" (UniqueName: \"kubernetes.io/projected/13f63bef-fe93-4f7e-80f1-ebf4cc703378-kube-api-access-5fkps\") pod \"certified-operators-td56z\" (UID: \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\") " pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:41 crc kubenswrapper[5008]: I0318 18:18:41.270609 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:41 crc kubenswrapper[5008]: I0318 18:18:41.497755 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-td56z"] Mar 18 18:18:41 crc kubenswrapper[5008]: I0318 18:18:41.680011 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td56z" event={"ID":"13f63bef-fe93-4f7e-80f1-ebf4cc703378","Type":"ContainerStarted","Data":"d0ad2623d32652ab38abdafa599333b0bef6436a9bb3c1a07352abb47d840b51"} Mar 18 18:18:41 crc kubenswrapper[5008]: I0318 18:18:41.680052 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td56z" event={"ID":"13f63bef-fe93-4f7e-80f1-ebf4cc703378","Type":"ContainerStarted","Data":"5709d508b56a71f79bccc6a569860815a172e073bd0619df8b950a5395e55da7"} Mar 18 18:18:42 crc kubenswrapper[5008]: I0318 18:18:42.692268 5008 generic.go:334] "Generic (PLEG): container finished" podID="13f63bef-fe93-4f7e-80f1-ebf4cc703378" containerID="d0ad2623d32652ab38abdafa599333b0bef6436a9bb3c1a07352abb47d840b51" exitCode=0 Mar 18 18:18:42 crc kubenswrapper[5008]: I0318 18:18:42.692338 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td56z" event={"ID":"13f63bef-fe93-4f7e-80f1-ebf4cc703378","Type":"ContainerDied","Data":"d0ad2623d32652ab38abdafa599333b0bef6436a9bb3c1a07352abb47d840b51"} Mar 18 18:18:42 crc kubenswrapper[5008]: I0318 18:18:42.692911 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td56z" event={"ID":"13f63bef-fe93-4f7e-80f1-ebf4cc703378","Type":"ContainerStarted","Data":"f48b19a72ad7d9f78dce8e1638eeb92d1ecc321f4e0cb70eae3f79def9d93419"} Mar 18 18:18:43 crc kubenswrapper[5008]: I0318 18:18:43.701408 5008 generic.go:334] "Generic (PLEG): container finished" podID="13f63bef-fe93-4f7e-80f1-ebf4cc703378" 
containerID="f48b19a72ad7d9f78dce8e1638eeb92d1ecc321f4e0cb70eae3f79def9d93419" exitCode=0 Mar 18 18:18:43 crc kubenswrapper[5008]: I0318 18:18:43.701480 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td56z" event={"ID":"13f63bef-fe93-4f7e-80f1-ebf4cc703378","Type":"ContainerDied","Data":"f48b19a72ad7d9f78dce8e1638eeb92d1ecc321f4e0cb70eae3f79def9d93419"} Mar 18 18:18:43 crc kubenswrapper[5008]: I0318 18:18:43.701516 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td56z" event={"ID":"13f63bef-fe93-4f7e-80f1-ebf4cc703378","Type":"ContainerStarted","Data":"d175ffc7e8d350c85943e670db20f0f0074516e569563f3d9712ed1abe81a167"} Mar 18 18:18:43 crc kubenswrapper[5008]: I0318 18:18:43.725876 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-td56z" podStartSLOduration=2.273335779 podStartE2EDuration="3.725858943s" podCreationTimestamp="2026-03-18 18:18:40 +0000 UTC" firstStartedPulling="2026-03-18 18:18:41.68152449 +0000 UTC m=+978.200997569" lastFinishedPulling="2026-03-18 18:18:43.134047634 +0000 UTC m=+979.653520733" observedRunningTime="2026-03-18 18:18:43.722635057 +0000 UTC m=+980.242108156" watchObservedRunningTime="2026-03-18 18:18:43.725858943 +0000 UTC m=+980.245332022" Mar 18 18:18:48 crc kubenswrapper[5008]: I0318 18:18:48.372699 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84bb8d8d7b-wj6kv" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.131853 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-mnvl6"] Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.133928 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.135854 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-qkz5t" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.135940 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.136361 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.152204 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq"] Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.153080 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.154393 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.168113 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq"] Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.226668 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-pdjkn"] Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.227604 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-pdjkn" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.229414 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.229525 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wcx6g" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.235967 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.235998 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.241775 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-q5drs"] Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.242897 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-q5drs" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.246753 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-q5drs"] Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.289709 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.290079 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/32700067-407c-41a0-8d49-835fd75bb28d-frr-startup\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.290425 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxw9g\" (UniqueName: \"kubernetes.io/projected/32700067-407c-41a0-8d49-835fd75bb28d-kube-api-access-vxw9g\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.290489 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/32700067-407c-41a0-8d49-835fd75bb28d-metrics\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.290537 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkbgv\" (UniqueName: \"kubernetes.io/projected/b7ac363f-dcd1-43df-a280-45ded08e9446-kube-api-access-gkbgv\") pod \"frr-k8s-webhook-server-bcc4b6f68-5ctcq\" (UID: \"b7ac363f-dcd1-43df-a280-45ded08e9446\") " 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.290612 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7ac363f-dcd1-43df-a280-45ded08e9446-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5ctcq\" (UID: \"b7ac363f-dcd1-43df-a280-45ded08e9446\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.290654 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/32700067-407c-41a0-8d49-835fd75bb28d-frr-sockets\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.290743 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32700067-407c-41a0-8d49-835fd75bb28d-metrics-certs\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.290784 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/32700067-407c-41a0-8d49-835fd75bb28d-frr-conf\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.290885 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/32700067-407c-41a0-8d49-835fd75bb28d-reloader\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc 
kubenswrapper[5008]: I0318 18:18:49.392012 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j54v6\" (UniqueName: \"kubernetes.io/projected/44eda89c-4b47-46f6-a60f-a439d8721b2f-kube-api-access-j54v6\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392061 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32700067-407c-41a0-8d49-835fd75bb28d-metrics-certs\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392080 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/32700067-407c-41a0-8d49-835fd75bb28d-frr-conf\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392102 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/44eda89c-4b47-46f6-a60f-a439d8721b2f-memberlist\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392124 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/32700067-407c-41a0-8d49-835fd75bb28d-reloader\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392141 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/652e9f56-c4d2-493b-bc68-11fd5cff1657-metrics-certs\") pod \"controller-7bb4cc7c98-q5drs\" (UID: \"652e9f56-c4d2-493b-bc68-11fd5cff1657\") " pod="metallb-system/controller-7bb4cc7c98-q5drs" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392172 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/44eda89c-4b47-46f6-a60f-a439d8721b2f-metallb-excludel2\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392205 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/32700067-407c-41a0-8d49-835fd75bb28d-frr-startup\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392220 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w62nf\" (UniqueName: \"kubernetes.io/projected/652e9f56-c4d2-493b-bc68-11fd5cff1657-kube-api-access-w62nf\") pod \"controller-7bb4cc7c98-q5drs\" (UID: \"652e9f56-c4d2-493b-bc68-11fd5cff1657\") " pod="metallb-system/controller-7bb4cc7c98-q5drs" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392236 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxw9g\" (UniqueName: \"kubernetes.io/projected/32700067-407c-41a0-8d49-835fd75bb28d-kube-api-access-vxw9g\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392252 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/44eda89c-4b47-46f6-a60f-a439d8721b2f-metrics-certs\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392267 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/32700067-407c-41a0-8d49-835fd75bb28d-metrics\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392283 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkbgv\" (UniqueName: \"kubernetes.io/projected/b7ac363f-dcd1-43df-a280-45ded08e9446-kube-api-access-gkbgv\") pod \"frr-k8s-webhook-server-bcc4b6f68-5ctcq\" (UID: \"b7ac363f-dcd1-43df-a280-45ded08e9446\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392298 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/652e9f56-c4d2-493b-bc68-11fd5cff1657-cert\") pod \"controller-7bb4cc7c98-q5drs\" (UID: \"652e9f56-c4d2-493b-bc68-11fd5cff1657\") " pod="metallb-system/controller-7bb4cc7c98-q5drs" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392314 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7ac363f-dcd1-43df-a280-45ded08e9446-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5ctcq\" (UID: \"b7ac363f-dcd1-43df-a280-45ded08e9446\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392330 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/32700067-407c-41a0-8d49-835fd75bb28d-frr-sockets\") pod 
\"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.392698 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/32700067-407c-41a0-8d49-835fd75bb28d-frr-sockets\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.393477 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/32700067-407c-41a0-8d49-835fd75bb28d-frr-conf\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.394137 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/32700067-407c-41a0-8d49-835fd75bb28d-metrics\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.394179 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/32700067-407c-41a0-8d49-835fd75bb28d-reloader\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.394599 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/32700067-407c-41a0-8d49-835fd75bb28d-frr-startup\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.409369 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/32700067-407c-41a0-8d49-835fd75bb28d-metrics-certs\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.409451 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7ac363f-dcd1-43df-a280-45ded08e9446-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5ctcq\" (UID: \"b7ac363f-dcd1-43df-a280-45ded08e9446\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.416302 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkbgv\" (UniqueName: \"kubernetes.io/projected/b7ac363f-dcd1-43df-a280-45ded08e9446-kube-api-access-gkbgv\") pod \"frr-k8s-webhook-server-bcc4b6f68-5ctcq\" (UID: \"b7ac363f-dcd1-43df-a280-45ded08e9446\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.424419 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxw9g\" (UniqueName: \"kubernetes.io/projected/32700067-407c-41a0-8d49-835fd75bb28d-kube-api-access-vxw9g\") pod \"frr-k8s-mnvl6\" (UID: \"32700067-407c-41a0-8d49-835fd75bb28d\") " pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.451441 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.471124 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.494427 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/652e9f56-c4d2-493b-bc68-11fd5cff1657-metrics-certs\") pod \"controller-7bb4cc7c98-q5drs\" (UID: \"652e9f56-c4d2-493b-bc68-11fd5cff1657\") " pod="metallb-system/controller-7bb4cc7c98-q5drs" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.494507 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/44eda89c-4b47-46f6-a60f-a439d8721b2f-metallb-excludel2\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.494547 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w62nf\" (UniqueName: \"kubernetes.io/projected/652e9f56-c4d2-493b-bc68-11fd5cff1657-kube-api-access-w62nf\") pod \"controller-7bb4cc7c98-q5drs\" (UID: \"652e9f56-c4d2-493b-bc68-11fd5cff1657\") " pod="metallb-system/controller-7bb4cc7c98-q5drs" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.494590 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44eda89c-4b47-46f6-a60f-a439d8721b2f-metrics-certs\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.494622 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/652e9f56-c4d2-493b-bc68-11fd5cff1657-cert\") pod \"controller-7bb4cc7c98-q5drs\" (UID: \"652e9f56-c4d2-493b-bc68-11fd5cff1657\") " pod="metallb-system/controller-7bb4cc7c98-q5drs" Mar 18 18:18:49 crc 
kubenswrapper[5008]: I0318 18:18:49.494666 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j54v6\" (UniqueName: \"kubernetes.io/projected/44eda89c-4b47-46f6-a60f-a439d8721b2f-kube-api-access-j54v6\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.494700 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/44eda89c-4b47-46f6-a60f-a439d8721b2f-memberlist\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:49 crc kubenswrapper[5008]: E0318 18:18:49.494799 5008 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 18:18:49 crc kubenswrapper[5008]: E0318 18:18:49.494862 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44eda89c-4b47-46f6-a60f-a439d8721b2f-memberlist podName:44eda89c-4b47-46f6-a60f-a439d8721b2f nodeName:}" failed. No retries permitted until 2026-03-18 18:18:49.99484165 +0000 UTC m=+986.514314729 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/44eda89c-4b47-46f6-a60f-a439d8721b2f-memberlist") pod "speaker-pdjkn" (UID: "44eda89c-4b47-46f6-a60f-a439d8721b2f") : secret "metallb-memberlist" not found Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.495188 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/44eda89c-4b47-46f6-a60f-a439d8721b2f-metallb-excludel2\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.496704 5008 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.501225 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44eda89c-4b47-46f6-a60f-a439d8721b2f-metrics-certs\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.502146 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/652e9f56-c4d2-493b-bc68-11fd5cff1657-metrics-certs\") pod \"controller-7bb4cc7c98-q5drs\" (UID: \"652e9f56-c4d2-493b-bc68-11fd5cff1657\") " pod="metallb-system/controller-7bb4cc7c98-q5drs" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.510251 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/652e9f56-c4d2-493b-bc68-11fd5cff1657-cert\") pod \"controller-7bb4cc7c98-q5drs\" (UID: \"652e9f56-c4d2-493b-bc68-11fd5cff1657\") " pod="metallb-system/controller-7bb4cc7c98-q5drs" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.510997 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-w62nf\" (UniqueName: \"kubernetes.io/projected/652e9f56-c4d2-493b-bc68-11fd5cff1657-kube-api-access-w62nf\") pod \"controller-7bb4cc7c98-q5drs\" (UID: \"652e9f56-c4d2-493b-bc68-11fd5cff1657\") " pod="metallb-system/controller-7bb4cc7c98-q5drs" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.512783 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j54v6\" (UniqueName: \"kubernetes.io/projected/44eda89c-4b47-46f6-a60f-a439d8721b2f-kube-api-access-j54v6\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.612824 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-q5drs" Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.746856 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnvl6" event={"ID":"32700067-407c-41a0-8d49-835fd75bb28d","Type":"ContainerStarted","Data":"be60e25fbbf02e1ef9559237803d95942811e81eaf2400411149649e903d1223"} Mar 18 18:18:49 crc kubenswrapper[5008]: I0318 18:18:49.871577 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq"] Mar 18 18:18:49 crc kubenswrapper[5008]: W0318 18:18:49.877331 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7ac363f_dcd1_43df_a280_45ded08e9446.slice/crio-e24f26dff030bb0ece714bd0e8a7b005015a0ffd46596e057761ef7bb78fa1b1 WatchSource:0}: Error finding container e24f26dff030bb0ece714bd0e8a7b005015a0ffd46596e057761ef7bb78fa1b1: Status 404 returned error can't find the container with id e24f26dff030bb0ece714bd0e8a7b005015a0ffd46596e057761ef7bb78fa1b1 Mar 18 18:18:50 crc kubenswrapper[5008]: I0318 18:18:50.000103 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"memberlist\" (UniqueName: \"kubernetes.io/secret/44eda89c-4b47-46f6-a60f-a439d8721b2f-memberlist\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:50 crc kubenswrapper[5008]: E0318 18:18:50.000251 5008 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 18:18:50 crc kubenswrapper[5008]: E0318 18:18:50.000306 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44eda89c-4b47-46f6-a60f-a439d8721b2f-memberlist podName:44eda89c-4b47-46f6-a60f-a439d8721b2f nodeName:}" failed. No retries permitted until 2026-03-18 18:18:51.000290918 +0000 UTC m=+987.519764007 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/44eda89c-4b47-46f6-a60f-a439d8721b2f-memberlist") pod "speaker-pdjkn" (UID: "44eda89c-4b47-46f6-a60f-a439d8721b2f") : secret "metallb-memberlist" not found Mar 18 18:18:50 crc kubenswrapper[5008]: I0318 18:18:50.012493 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-q5drs"] Mar 18 18:18:50 crc kubenswrapper[5008]: W0318 18:18:50.019369 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod652e9f56_c4d2_493b_bc68_11fd5cff1657.slice/crio-6c37c48baa1666a4c9be06e90ab37ca3c389c694c3231a4af61d33ca540a24d5 WatchSource:0}: Error finding container 6c37c48baa1666a4c9be06e90ab37ca3c389c694c3231a4af61d33ca540a24d5: Status 404 returned error can't find the container with id 6c37c48baa1666a4c9be06e90ab37ca3c389c694c3231a4af61d33ca540a24d5 Mar 18 18:18:50 crc kubenswrapper[5008]: I0318 18:18:50.756329 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-q5drs" 
event={"ID":"652e9f56-c4d2-493b-bc68-11fd5cff1657","Type":"ContainerStarted","Data":"f2aecb8d7002016e747636bf254415cbe06b41bab21b729daed7b00be0fd5798"} Mar 18 18:18:50 crc kubenswrapper[5008]: I0318 18:18:50.756734 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-q5drs" Mar 18 18:18:50 crc kubenswrapper[5008]: I0318 18:18:50.756750 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-q5drs" event={"ID":"652e9f56-c4d2-493b-bc68-11fd5cff1657","Type":"ContainerStarted","Data":"8d6a131d0aebd9a14051d167c2af246c3f30dc35707129da1177e954b66edec8"} Mar 18 18:18:50 crc kubenswrapper[5008]: I0318 18:18:50.756762 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-q5drs" event={"ID":"652e9f56-c4d2-493b-bc68-11fd5cff1657","Type":"ContainerStarted","Data":"6c37c48baa1666a4c9be06e90ab37ca3c389c694c3231a4af61d33ca540a24d5"} Mar 18 18:18:50 crc kubenswrapper[5008]: I0318 18:18:50.758385 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq" event={"ID":"b7ac363f-dcd1-43df-a280-45ded08e9446","Type":"ContainerStarted","Data":"e24f26dff030bb0ece714bd0e8a7b005015a0ffd46596e057761ef7bb78fa1b1"} Mar 18 18:18:50 crc kubenswrapper[5008]: I0318 18:18:50.788205 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-q5drs" podStartSLOduration=1.788181882 podStartE2EDuration="1.788181882s" podCreationTimestamp="2026-03-18 18:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:18:50.778232167 +0000 UTC m=+987.297705256" watchObservedRunningTime="2026-03-18 18:18:50.788181882 +0000 UTC m=+987.307654981" Mar 18 18:18:51 crc kubenswrapper[5008]: I0318 18:18:51.013300 5008 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/44eda89c-4b47-46f6-a60f-a439d8721b2f-memberlist\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:51 crc kubenswrapper[5008]: I0318 18:18:51.020819 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/44eda89c-4b47-46f6-a60f-a439d8721b2f-memberlist\") pod \"speaker-pdjkn\" (UID: \"44eda89c-4b47-46f6-a60f-a439d8721b2f\") " pod="metallb-system/speaker-pdjkn" Mar 18 18:18:51 crc kubenswrapper[5008]: I0318 18:18:51.042920 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-pdjkn" Mar 18 18:18:51 crc kubenswrapper[5008]: I0318 18:18:51.274439 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:51 crc kubenswrapper[5008]: I0318 18:18:51.274768 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:51 crc kubenswrapper[5008]: I0318 18:18:51.337210 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:51 crc kubenswrapper[5008]: I0318 18:18:51.768036 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pdjkn" event={"ID":"44eda89c-4b47-46f6-a60f-a439d8721b2f","Type":"ContainerStarted","Data":"5f1788a456b7642d95d6e57be6b2849b1e0693a0aa41bee9f30f98c84c66ba6a"} Mar 18 18:18:51 crc kubenswrapper[5008]: I0318 18:18:51.768427 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pdjkn" event={"ID":"44eda89c-4b47-46f6-a60f-a439d8721b2f","Type":"ContainerStarted","Data":"5acb960e4b2967da89736d8bde8e692f0356fda444ba4dfd5f0806c56d088de0"} Mar 18 18:18:51 crc kubenswrapper[5008]: I0318 18:18:51.768474 5008 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pdjkn" event={"ID":"44eda89c-4b47-46f6-a60f-a439d8721b2f","Type":"ContainerStarted","Data":"c0aeeb123408c68e8c251567c17311c1fd7c81eb51fb6755fe1ca2f75d66ebad"} Mar 18 18:18:51 crc kubenswrapper[5008]: I0318 18:18:51.768622 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-pdjkn" Mar 18 18:18:51 crc kubenswrapper[5008]: I0318 18:18:51.786828 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-pdjkn" podStartSLOduration=2.786809841 podStartE2EDuration="2.786809841s" podCreationTimestamp="2026-03-18 18:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:18:51.785204379 +0000 UTC m=+988.304677458" watchObservedRunningTime="2026-03-18 18:18:51.786809841 +0000 UTC m=+988.306282920" Mar 18 18:18:51 crc kubenswrapper[5008]: I0318 18:18:51.812141 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.123674 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-td56z"] Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.124460 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-td56z" podUID="13f63bef-fe93-4f7e-80f1-ebf4cc703378" containerName="registry-server" containerID="cri-o://d175ffc7e8d350c85943e670db20f0f0074516e569563f3d9712ed1abe81a167" gracePeriod=2 Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.460192 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.460276 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.460383 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.460963 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f98223e188c7e180bb9c16b9b888a18eaae99967d91bf2ff048b12e80fd84a1c"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.461012 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://f98223e188c7e180bb9c16b9b888a18eaae99967d91bf2ff048b12e80fd84a1c" gracePeriod=600 Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.524504 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.666007 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13f63bef-fe93-4f7e-80f1-ebf4cc703378-catalog-content\") pod \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\" (UID: \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\") " Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.666144 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13f63bef-fe93-4f7e-80f1-ebf4cc703378-utilities\") pod \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\" (UID: \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\") " Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.666166 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fkps\" (UniqueName: \"kubernetes.io/projected/13f63bef-fe93-4f7e-80f1-ebf4cc703378-kube-api-access-5fkps\") pod \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\" (UID: \"13f63bef-fe93-4f7e-80f1-ebf4cc703378\") " Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.667231 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13f63bef-fe93-4f7e-80f1-ebf4cc703378-utilities" (OuterVolumeSpecName: "utilities") pod "13f63bef-fe93-4f7e-80f1-ebf4cc703378" (UID: "13f63bef-fe93-4f7e-80f1-ebf4cc703378"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.673717 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f63bef-fe93-4f7e-80f1-ebf4cc703378-kube-api-access-5fkps" (OuterVolumeSpecName: "kube-api-access-5fkps") pod "13f63bef-fe93-4f7e-80f1-ebf4cc703378" (UID: "13f63bef-fe93-4f7e-80f1-ebf4cc703378"). InnerVolumeSpecName "kube-api-access-5fkps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.714505 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13f63bef-fe93-4f7e-80f1-ebf4cc703378-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13f63bef-fe93-4f7e-80f1-ebf4cc703378" (UID: "13f63bef-fe93-4f7e-80f1-ebf4cc703378"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.768271 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fkps\" (UniqueName: \"kubernetes.io/projected/13f63bef-fe93-4f7e-80f1-ebf4cc703378-kube-api-access-5fkps\") on node \"crc\" DevicePath \"\"" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.768308 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13f63bef-fe93-4f7e-80f1-ebf4cc703378-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.768320 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13f63bef-fe93-4f7e-80f1-ebf4cc703378-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.789896 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="f98223e188c7e180bb9c16b9b888a18eaae99967d91bf2ff048b12e80fd84a1c" exitCode=0 Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.789959 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"f98223e188c7e180bb9c16b9b888a18eaae99967d91bf2ff048b12e80fd84a1c"} Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.789988 5008 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"fd022c5c3ebfc1487f31b5991ba1ecc58d2f77c9bf3db917b976667648d3cca3"} Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.790005 5008 scope.go:117] "RemoveContainer" containerID="214d6c92536a599ebc37d353c0d760916c05e754a965960c9c93d0c42cd15af6" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.793600 5008 generic.go:334] "Generic (PLEG): container finished" podID="13f63bef-fe93-4f7e-80f1-ebf4cc703378" containerID="d175ffc7e8d350c85943e670db20f0f0074516e569563f3d9712ed1abe81a167" exitCode=0 Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.793627 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td56z" event={"ID":"13f63bef-fe93-4f7e-80f1-ebf4cc703378","Type":"ContainerDied","Data":"d175ffc7e8d350c85943e670db20f0f0074516e569563f3d9712ed1abe81a167"} Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.793649 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td56z" event={"ID":"13f63bef-fe93-4f7e-80f1-ebf4cc703378","Type":"ContainerDied","Data":"5709d508b56a71f79bccc6a569860815a172e073bd0619df8b950a5395e55da7"} Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.793676 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-td56z" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.819253 5008 scope.go:117] "RemoveContainer" containerID="d175ffc7e8d350c85943e670db20f0f0074516e569563f3d9712ed1abe81a167" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.820518 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-td56z"] Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.824700 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-td56z"] Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.837642 5008 scope.go:117] "RemoveContainer" containerID="f48b19a72ad7d9f78dce8e1638eeb92d1ecc321f4e0cb70eae3f79def9d93419" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.851329 5008 scope.go:117] "RemoveContainer" containerID="d0ad2623d32652ab38abdafa599333b0bef6436a9bb3c1a07352abb47d840b51" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.865004 5008 scope.go:117] "RemoveContainer" containerID="d175ffc7e8d350c85943e670db20f0f0074516e569563f3d9712ed1abe81a167" Mar 18 18:18:54 crc kubenswrapper[5008]: E0318 18:18:54.865407 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d175ffc7e8d350c85943e670db20f0f0074516e569563f3d9712ed1abe81a167\": container with ID starting with d175ffc7e8d350c85943e670db20f0f0074516e569563f3d9712ed1abe81a167 not found: ID does not exist" containerID="d175ffc7e8d350c85943e670db20f0f0074516e569563f3d9712ed1abe81a167" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.865459 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d175ffc7e8d350c85943e670db20f0f0074516e569563f3d9712ed1abe81a167"} err="failed to get container status \"d175ffc7e8d350c85943e670db20f0f0074516e569563f3d9712ed1abe81a167\": rpc error: code = NotFound desc = could not find 
container \"d175ffc7e8d350c85943e670db20f0f0074516e569563f3d9712ed1abe81a167\": container with ID starting with d175ffc7e8d350c85943e670db20f0f0074516e569563f3d9712ed1abe81a167 not found: ID does not exist" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.865494 5008 scope.go:117] "RemoveContainer" containerID="f48b19a72ad7d9f78dce8e1638eeb92d1ecc321f4e0cb70eae3f79def9d93419" Mar 18 18:18:54 crc kubenswrapper[5008]: E0318 18:18:54.865866 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f48b19a72ad7d9f78dce8e1638eeb92d1ecc321f4e0cb70eae3f79def9d93419\": container with ID starting with f48b19a72ad7d9f78dce8e1638eeb92d1ecc321f4e0cb70eae3f79def9d93419 not found: ID does not exist" containerID="f48b19a72ad7d9f78dce8e1638eeb92d1ecc321f4e0cb70eae3f79def9d93419" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.865901 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f48b19a72ad7d9f78dce8e1638eeb92d1ecc321f4e0cb70eae3f79def9d93419"} err="failed to get container status \"f48b19a72ad7d9f78dce8e1638eeb92d1ecc321f4e0cb70eae3f79def9d93419\": rpc error: code = NotFound desc = could not find container \"f48b19a72ad7d9f78dce8e1638eeb92d1ecc321f4e0cb70eae3f79def9d93419\": container with ID starting with f48b19a72ad7d9f78dce8e1638eeb92d1ecc321f4e0cb70eae3f79def9d93419 not found: ID does not exist" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.865927 5008 scope.go:117] "RemoveContainer" containerID="d0ad2623d32652ab38abdafa599333b0bef6436a9bb3c1a07352abb47d840b51" Mar 18 18:18:54 crc kubenswrapper[5008]: E0318 18:18:54.866428 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0ad2623d32652ab38abdafa599333b0bef6436a9bb3c1a07352abb47d840b51\": container with ID starting with d0ad2623d32652ab38abdafa599333b0bef6436a9bb3c1a07352abb47d840b51 not found: ID does 
not exist" containerID="d0ad2623d32652ab38abdafa599333b0bef6436a9bb3c1a07352abb47d840b51" Mar 18 18:18:54 crc kubenswrapper[5008]: I0318 18:18:54.866455 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ad2623d32652ab38abdafa599333b0bef6436a9bb3c1a07352abb47d840b51"} err="failed to get container status \"d0ad2623d32652ab38abdafa599333b0bef6436a9bb3c1a07352abb47d840b51\": rpc error: code = NotFound desc = could not find container \"d0ad2623d32652ab38abdafa599333b0bef6436a9bb3c1a07352abb47d840b51\": container with ID starting with d0ad2623d32652ab38abdafa599333b0bef6436a9bb3c1a07352abb47d840b51 not found: ID does not exist" Mar 18 18:18:56 crc kubenswrapper[5008]: I0318 18:18:56.220321 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13f63bef-fe93-4f7e-80f1-ebf4cc703378" path="/var/lib/kubelet/pods/13f63bef-fe93-4f7e-80f1-ebf4cc703378/volumes" Mar 18 18:18:57 crc kubenswrapper[5008]: I0318 18:18:57.823373 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq" event={"ID":"b7ac363f-dcd1-43df-a280-45ded08e9446","Type":"ContainerStarted","Data":"1427720584bb672eed0f7ed2fb683681b904b7a30b4e0da2d119af9375ec58a4"} Mar 18 18:18:57 crc kubenswrapper[5008]: I0318 18:18:57.823752 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq" Mar 18 18:18:57 crc kubenswrapper[5008]: I0318 18:18:57.825654 5008 generic.go:334] "Generic (PLEG): container finished" podID="32700067-407c-41a0-8d49-835fd75bb28d" containerID="f5ec9d40b3041fdfc79dcbe0ef494ab063b6ab2f514c2a96abc31f8fdcbafe97" exitCode=0 Mar 18 18:18:57 crc kubenswrapper[5008]: I0318 18:18:57.825697 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnvl6" 
event={"ID":"32700067-407c-41a0-8d49-835fd75bb28d","Type":"ContainerDied","Data":"f5ec9d40b3041fdfc79dcbe0ef494ab063b6ab2f514c2a96abc31f8fdcbafe97"} Mar 18 18:18:57 crc kubenswrapper[5008]: I0318 18:18:57.851407 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq" podStartSLOduration=1.36550389 podStartE2EDuration="8.851382075s" podCreationTimestamp="2026-03-18 18:18:49 +0000 UTC" firstStartedPulling="2026-03-18 18:18:49.879583522 +0000 UTC m=+986.399056601" lastFinishedPulling="2026-03-18 18:18:57.365461697 +0000 UTC m=+993.884934786" observedRunningTime="2026-03-18 18:18:57.848404985 +0000 UTC m=+994.367878064" watchObservedRunningTime="2026-03-18 18:18:57.851382075 +0000 UTC m=+994.370855164" Mar 18 18:18:58 crc kubenswrapper[5008]: I0318 18:18:58.834242 5008 generic.go:334] "Generic (PLEG): container finished" podID="32700067-407c-41a0-8d49-835fd75bb28d" containerID="86d6d172898e9a0a43c5ec5558c8bbcc261639b00ec0a7ed47597c65ed1596b1" exitCode=0 Mar 18 18:18:58 crc kubenswrapper[5008]: I0318 18:18:58.834300 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnvl6" event={"ID":"32700067-407c-41a0-8d49-835fd75bb28d","Type":"ContainerDied","Data":"86d6d172898e9a0a43c5ec5558c8bbcc261639b00ec0a7ed47597c65ed1596b1"} Mar 18 18:18:59 crc kubenswrapper[5008]: I0318 18:18:59.843620 5008 generic.go:334] "Generic (PLEG): container finished" podID="32700067-407c-41a0-8d49-835fd75bb28d" containerID="ec68f3ff990f17beaac2418887d055577e48aadc44c7df441e8c2a8ed403b9a1" exitCode=0 Mar 18 18:18:59 crc kubenswrapper[5008]: I0318 18:18:59.843866 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnvl6" event={"ID":"32700067-407c-41a0-8d49-835fd75bb28d","Type":"ContainerDied","Data":"ec68f3ff990f17beaac2418887d055577e48aadc44c7df441e8c2a8ed403b9a1"} Mar 18 18:19:00 crc kubenswrapper[5008]: I0318 18:19:00.853082 5008 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-mnvl6" event={"ID":"32700067-407c-41a0-8d49-835fd75bb28d","Type":"ContainerStarted","Data":"6d975fe61a0d4d682adda934cc7a21394ead9cc1cbdea22aae7496e09038d5c9"} Mar 18 18:19:00 crc kubenswrapper[5008]: I0318 18:19:00.853466 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnvl6" event={"ID":"32700067-407c-41a0-8d49-835fd75bb28d","Type":"ContainerStarted","Data":"f31604bdb46ac907ccdcaef5046091d2eb886d3ba0ef4a6934b715f92cf8bb0f"} Mar 18 18:19:00 crc kubenswrapper[5008]: I0318 18:19:00.853475 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnvl6" event={"ID":"32700067-407c-41a0-8d49-835fd75bb28d","Type":"ContainerStarted","Data":"cbebd7662acf8a9cde2d43122ee8d14310dd8a00620ec93a7c59011758331406"} Mar 18 18:19:00 crc kubenswrapper[5008]: I0318 18:19:00.853484 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnvl6" event={"ID":"32700067-407c-41a0-8d49-835fd75bb28d","Type":"ContainerStarted","Data":"d139ddce78ec55bbe2178df5f96185e7f70c4e1e2f61e1974f282513f03f0ac4"} Mar 18 18:19:00 crc kubenswrapper[5008]: I0318 18:19:00.853491 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnvl6" event={"ID":"32700067-407c-41a0-8d49-835fd75bb28d","Type":"ContainerStarted","Data":"eeae847a0a23a77fd5b4476b766846041f4141240c31cad91c419f266bc26d10"} Mar 18 18:19:01 crc kubenswrapper[5008]: I0318 18:19:01.049062 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-pdjkn" Mar 18 18:19:01 crc kubenswrapper[5008]: I0318 18:19:01.863102 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnvl6" event={"ID":"32700067-407c-41a0-8d49-835fd75bb28d","Type":"ContainerStarted","Data":"d857703e4d00c3c4eab61f127f7c69d0928b0512c216345fc60e0c70aaaa3cf2"} Mar 18 18:19:01 crc kubenswrapper[5008]: I0318 18:19:01.863790 5008 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:19:01 crc kubenswrapper[5008]: I0318 18:19:01.891216 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-mnvl6" podStartSLOduration=5.158747193 podStartE2EDuration="12.891185208s" podCreationTimestamp="2026-03-18 18:18:49 +0000 UTC" firstStartedPulling="2026-03-18 18:18:49.65209451 +0000 UTC m=+986.171567599" lastFinishedPulling="2026-03-18 18:18:57.384532525 +0000 UTC m=+993.904005614" observedRunningTime="2026-03-18 18:19:01.883232916 +0000 UTC m=+998.402705995" watchObservedRunningTime="2026-03-18 18:19:01.891185208 +0000 UTC m=+998.410658327" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.733658 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7"] Mar 18 18:19:02 crc kubenswrapper[5008]: E0318 18:19:02.734078 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f63bef-fe93-4f7e-80f1-ebf4cc703378" containerName="extract-content" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.734097 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f63bef-fe93-4f7e-80f1-ebf4cc703378" containerName="extract-content" Mar 18 18:19:02 crc kubenswrapper[5008]: E0318 18:19:02.734118 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f63bef-fe93-4f7e-80f1-ebf4cc703378" containerName="extract-utilities" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.734126 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f63bef-fe93-4f7e-80f1-ebf4cc703378" containerName="extract-utilities" Mar 18 18:19:02 crc kubenswrapper[5008]: E0318 18:19:02.734138 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f63bef-fe93-4f7e-80f1-ebf4cc703378" containerName="registry-server" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.734149 5008 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="13f63bef-fe93-4f7e-80f1-ebf4cc703378" containerName="registry-server" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.735656 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f63bef-fe93-4f7e-80f1-ebf4cc703378" containerName="registry-server" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.737298 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.740314 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.750083 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7"] Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.769621 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7\" (UID: \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.769670 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7\" (UID: \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.769738 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf7hh\" (UniqueName: \"kubernetes.io/projected/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-kube-api-access-nf7hh\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7\" (UID: \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.870699 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf7hh\" (UniqueName: \"kubernetes.io/projected/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-kube-api-access-nf7hh\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7\" (UID: \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.871166 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7\" (UID: \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.871201 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7\" (UID: \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.871680 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7\" (UID: \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.871774 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7\" (UID: \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" Mar 18 18:19:02 crc kubenswrapper[5008]: I0318 18:19:02.891929 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf7hh\" (UniqueName: \"kubernetes.io/projected/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-kube-api-access-nf7hh\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7\" (UID: \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" Mar 18 18:19:03 crc kubenswrapper[5008]: I0318 18:19:03.070196 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" Mar 18 18:19:03 crc kubenswrapper[5008]: I0318 18:19:03.527158 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7"] Mar 18 18:19:03 crc kubenswrapper[5008]: W0318 18:19:03.538230 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a50008d_38c2_4192_8a3f_8e18d2e0ef41.slice/crio-a3ca4a4849b41f9f78453b9828c1d0cbae52ea1745455ddb7bece2725e342265 WatchSource:0}: Error finding container a3ca4a4849b41f9f78453b9828c1d0cbae52ea1745455ddb7bece2725e342265: Status 404 returned error can't find the container with id a3ca4a4849b41f9f78453b9828c1d0cbae52ea1745455ddb7bece2725e342265 Mar 18 18:19:03 crc kubenswrapper[5008]: I0318 18:19:03.878222 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" event={"ID":"0a50008d-38c2-4192-8a3f-8e18d2e0ef41","Type":"ContainerStarted","Data":"a3ca4a4849b41f9f78453b9828c1d0cbae52ea1745455ddb7bece2725e342265"} Mar 18 18:19:04 crc kubenswrapper[5008]: I0318 18:19:04.451660 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:19:04 crc kubenswrapper[5008]: I0318 18:19:04.504902 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:19:04 crc kubenswrapper[5008]: I0318 18:19:04.885685 5008 generic.go:334] "Generic (PLEG): container finished" podID="0a50008d-38c2-4192-8a3f-8e18d2e0ef41" containerID="3cf18cced6ea801405054cf657f69e27ae34af4833a960104997b313fda250da" exitCode=0 Mar 18 18:19:04 crc kubenswrapper[5008]: I0318 18:19:04.885735 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" event={"ID":"0a50008d-38c2-4192-8a3f-8e18d2e0ef41","Type":"ContainerDied","Data":"3cf18cced6ea801405054cf657f69e27ae34af4833a960104997b313fda250da"} Mar 18 18:19:08 crc kubenswrapper[5008]: I0318 18:19:08.911614 5008 generic.go:334] "Generic (PLEG): container finished" podID="0a50008d-38c2-4192-8a3f-8e18d2e0ef41" containerID="2a36adce3d2055e01f71bb7c74e0a873dbfb61052cf25b7e13ea64827f0a3563" exitCode=0 Mar 18 18:19:08 crc kubenswrapper[5008]: I0318 18:19:08.911756 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" event={"ID":"0a50008d-38c2-4192-8a3f-8e18d2e0ef41","Type":"ContainerDied","Data":"2a36adce3d2055e01f71bb7c74e0a873dbfb61052cf25b7e13ea64827f0a3563"} Mar 18 18:19:09 crc kubenswrapper[5008]: I0318 18:19:09.454687 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-mnvl6" Mar 18 18:19:09 crc kubenswrapper[5008]: I0318 18:19:09.475689 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5ctcq" Mar 18 18:19:09 crc kubenswrapper[5008]: I0318 18:19:09.618115 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-q5drs" Mar 18 18:19:09 crc kubenswrapper[5008]: I0318 18:19:09.920464 5008 generic.go:334] "Generic (PLEG): container finished" podID="0a50008d-38c2-4192-8a3f-8e18d2e0ef41" containerID="6b455a50f2ed149e6cbe1e9a89783d0de0e0ec4d2b98c6e9a100c63397c02249" exitCode=0 Mar 18 18:19:09 crc kubenswrapper[5008]: I0318 18:19:09.920504 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" 
event={"ID":"0a50008d-38c2-4192-8a3f-8e18d2e0ef41","Type":"ContainerDied","Data":"6b455a50f2ed149e6cbe1e9a89783d0de0e0ec4d2b98c6e9a100c63397c02249"} Mar 18 18:19:11 crc kubenswrapper[5008]: I0318 18:19:11.161392 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" Mar 18 18:19:11 crc kubenswrapper[5008]: I0318 18:19:11.294412 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf7hh\" (UniqueName: \"kubernetes.io/projected/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-kube-api-access-nf7hh\") pod \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\" (UID: \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\") " Mar 18 18:19:11 crc kubenswrapper[5008]: I0318 18:19:11.294489 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-util\") pod \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\" (UID: \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\") " Mar 18 18:19:11 crc kubenswrapper[5008]: I0318 18:19:11.294583 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-bundle\") pod \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\" (UID: \"0a50008d-38c2-4192-8a3f-8e18d2e0ef41\") " Mar 18 18:19:11 crc kubenswrapper[5008]: I0318 18:19:11.295846 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-bundle" (OuterVolumeSpecName: "bundle") pod "0a50008d-38c2-4192-8a3f-8e18d2e0ef41" (UID: "0a50008d-38c2-4192-8a3f-8e18d2e0ef41"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:19:11 crc kubenswrapper[5008]: I0318 18:19:11.304524 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-kube-api-access-nf7hh" (OuterVolumeSpecName: "kube-api-access-nf7hh") pod "0a50008d-38c2-4192-8a3f-8e18d2e0ef41" (UID: "0a50008d-38c2-4192-8a3f-8e18d2e0ef41"). InnerVolumeSpecName "kube-api-access-nf7hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:19:11 crc kubenswrapper[5008]: I0318 18:19:11.338655 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-util" (OuterVolumeSpecName: "util") pod "0a50008d-38c2-4192-8a3f-8e18d2e0ef41" (UID: "0a50008d-38c2-4192-8a3f-8e18d2e0ef41"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:19:11 crc kubenswrapper[5008]: I0318 18:19:11.396089 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf7hh\" (UniqueName: \"kubernetes.io/projected/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-kube-api-access-nf7hh\") on node \"crc\" DevicePath \"\"" Mar 18 18:19:11 crc kubenswrapper[5008]: I0318 18:19:11.396136 5008 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-util\") on node \"crc\" DevicePath \"\"" Mar 18 18:19:11 crc kubenswrapper[5008]: I0318 18:19:11.396151 5008 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0a50008d-38c2-4192-8a3f-8e18d2e0ef41-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:19:11 crc kubenswrapper[5008]: I0318 18:19:11.938613 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" 
event={"ID":"0a50008d-38c2-4192-8a3f-8e18d2e0ef41","Type":"ContainerDied","Data":"a3ca4a4849b41f9f78453b9828c1d0cbae52ea1745455ddb7bece2725e342265"} Mar 18 18:19:11 crc kubenswrapper[5008]: I0318 18:19:11.938655 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3ca4a4849b41f9f78453b9828c1d0cbae52ea1745455ddb7bece2725e342265" Mar 18 18:19:11 crc kubenswrapper[5008]: I0318 18:19:11.938686 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.281190 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj"] Mar 18 18:19:16 crc kubenswrapper[5008]: E0318 18:19:16.281400 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a50008d-38c2-4192-8a3f-8e18d2e0ef41" containerName="util" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.281410 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a50008d-38c2-4192-8a3f-8e18d2e0ef41" containerName="util" Mar 18 18:19:16 crc kubenswrapper[5008]: E0318 18:19:16.281424 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a50008d-38c2-4192-8a3f-8e18d2e0ef41" containerName="pull" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.281429 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a50008d-38c2-4192-8a3f-8e18d2e0ef41" containerName="pull" Mar 18 18:19:16 crc kubenswrapper[5008]: E0318 18:19:16.281437 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a50008d-38c2-4192-8a3f-8e18d2e0ef41" containerName="extract" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.281444 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a50008d-38c2-4192-8a3f-8e18d2e0ef41" containerName="extract" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.281537 
5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a50008d-38c2-4192-8a3f-8e18d2e0ef41" containerName="extract" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.281966 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.286533 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.286885 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.287585 5008 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-djdcg" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.314193 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29335424-4552-4b00-af47-e6bff3cb79a9-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-4w9nj\" (UID: \"29335424-4552-4b00-af47-e6bff3cb79a9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.314262 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lskv\" (UniqueName: \"kubernetes.io/projected/29335424-4552-4b00-af47-e6bff3cb79a9-kube-api-access-9lskv\") pod \"cert-manager-operator-controller-manager-66c8bdd694-4w9nj\" (UID: \"29335424-4552-4b00-af47-e6bff3cb79a9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.351807 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj"] Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.415673 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29335424-4552-4b00-af47-e6bff3cb79a9-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-4w9nj\" (UID: \"29335424-4552-4b00-af47-e6bff3cb79a9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.415762 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lskv\" (UniqueName: \"kubernetes.io/projected/29335424-4552-4b00-af47-e6bff3cb79a9-kube-api-access-9lskv\") pod \"cert-manager-operator-controller-manager-66c8bdd694-4w9nj\" (UID: \"29335424-4552-4b00-af47-e6bff3cb79a9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.416250 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29335424-4552-4b00-af47-e6bff3cb79a9-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-4w9nj\" (UID: \"29335424-4552-4b00-af47-e6bff3cb79a9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.441574 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lskv\" (UniqueName: \"kubernetes.io/projected/29335424-4552-4b00-af47-e6bff3cb79a9-kube-api-access-9lskv\") pod \"cert-manager-operator-controller-manager-66c8bdd694-4w9nj\" (UID: \"29335424-4552-4b00-af47-e6bff3cb79a9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj" Mar 18 18:19:16 crc kubenswrapper[5008]: I0318 18:19:16.603173 5008 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj" Mar 18 18:19:17 crc kubenswrapper[5008]: I0318 18:19:17.020371 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj"] Mar 18 18:19:17 crc kubenswrapper[5008]: I0318 18:19:17.987312 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj" event={"ID":"29335424-4552-4b00-af47-e6bff3cb79a9","Type":"ContainerStarted","Data":"c6a408564bf5be867f2343c4a493e985f4301409a31013fbd15ec76eeee0a0f5"} Mar 18 18:19:21 crc kubenswrapper[5008]: I0318 18:19:21.020415 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj" event={"ID":"29335424-4552-4b00-af47-e6bff3cb79a9","Type":"ContainerStarted","Data":"296e76e6b710a00087940185b1338f027453de90af2dd3618021b814e88c4565"} Mar 18 18:19:21 crc kubenswrapper[5008]: I0318 18:19:21.054258 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-4w9nj" podStartSLOduration=1.728305744 podStartE2EDuration="5.054237465s" podCreationTimestamp="2026-03-18 18:19:16 +0000 UTC" firstStartedPulling="2026-03-18 18:19:17.028081027 +0000 UTC m=+1013.547554106" lastFinishedPulling="2026-03-18 18:19:20.354012738 +0000 UTC m=+1016.873485827" observedRunningTime="2026-03-18 18:19:21.050847265 +0000 UTC m=+1017.570320344" watchObservedRunningTime="2026-03-18 18:19:21.054237465 +0000 UTC m=+1017.573710534" Mar 18 18:19:25 crc kubenswrapper[5008]: I0318 18:19:25.220397 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-2wx9n"] Mar 18 18:19:25 crc kubenswrapper[5008]: I0318 18:19:25.223009 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-2wx9n" Mar 18 18:19:25 crc kubenswrapper[5008]: I0318 18:19:25.228478 5008 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-zxhdc" Mar 18 18:19:25 crc kubenswrapper[5008]: I0318 18:19:25.228902 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 18 18:19:25 crc kubenswrapper[5008]: I0318 18:19:25.228959 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 18 18:19:25 crc kubenswrapper[5008]: I0318 18:19:25.230281 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-2wx9n"] Mar 18 18:19:25 crc kubenswrapper[5008]: I0318 18:19:25.424053 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-2wx9n\" (UID: \"edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb\") " pod="cert-manager/cert-manager-webhook-6888856db4-2wx9n" Mar 18 18:19:25 crc kubenswrapper[5008]: I0318 18:19:25.424107 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq6vn\" (UniqueName: \"kubernetes.io/projected/edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb-kube-api-access-zq6vn\") pod \"cert-manager-webhook-6888856db4-2wx9n\" (UID: \"edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb\") " pod="cert-manager/cert-manager-webhook-6888856db4-2wx9n" Mar 18 18:19:25 crc kubenswrapper[5008]: I0318 18:19:25.525529 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-2wx9n\" (UID: 
\"edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb\") " pod="cert-manager/cert-manager-webhook-6888856db4-2wx9n" Mar 18 18:19:25 crc kubenswrapper[5008]: I0318 18:19:25.525684 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq6vn\" (UniqueName: \"kubernetes.io/projected/edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb-kube-api-access-zq6vn\") pod \"cert-manager-webhook-6888856db4-2wx9n\" (UID: \"edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb\") " pod="cert-manager/cert-manager-webhook-6888856db4-2wx9n" Mar 18 18:19:25 crc kubenswrapper[5008]: I0318 18:19:25.561534 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-2wx9n\" (UID: \"edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb\") " pod="cert-manager/cert-manager-webhook-6888856db4-2wx9n" Mar 18 18:19:25 crc kubenswrapper[5008]: I0318 18:19:25.562408 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq6vn\" (UniqueName: \"kubernetes.io/projected/edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb-kube-api-access-zq6vn\") pod \"cert-manager-webhook-6888856db4-2wx9n\" (UID: \"edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb\") " pod="cert-manager/cert-manager-webhook-6888856db4-2wx9n" Mar 18 18:19:25 crc kubenswrapper[5008]: I0318 18:19:25.841262 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-2wx9n" Mar 18 18:19:26 crc kubenswrapper[5008]: I0318 18:19:26.328712 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-2wx9n"] Mar 18 18:19:27 crc kubenswrapper[5008]: I0318 18:19:27.059028 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-2wx9n" event={"ID":"edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb","Type":"ContainerStarted","Data":"f624d90d5a4242a922553e99203f2b35bcc897c997a1159ddaa077de00106573"} Mar 18 18:19:27 crc kubenswrapper[5008]: I0318 18:19:27.356952 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8m92h"] Mar 18 18:19:27 crc kubenswrapper[5008]: I0318 18:19:27.358081 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-8m92h" Mar 18 18:19:27 crc kubenswrapper[5008]: I0318 18:19:27.362066 5008 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zrg7l" Mar 18 18:19:27 crc kubenswrapper[5008]: I0318 18:19:27.370203 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8m92h"] Mar 18 18:19:27 crc kubenswrapper[5008]: I0318 18:19:27.553003 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96b8dcae-2676-4c66-9902-ea0976801d41-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8m92h\" (UID: \"96b8dcae-2676-4c66-9902-ea0976801d41\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8m92h" Mar 18 18:19:27 crc kubenswrapper[5008]: I0318 18:19:27.553061 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wf2h\" (UniqueName: 
\"kubernetes.io/projected/96b8dcae-2676-4c66-9902-ea0976801d41-kube-api-access-2wf2h\") pod \"cert-manager-cainjector-5545bd876-8m92h\" (UID: \"96b8dcae-2676-4c66-9902-ea0976801d41\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8m92h" Mar 18 18:19:27 crc kubenswrapper[5008]: I0318 18:19:27.653967 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wf2h\" (UniqueName: \"kubernetes.io/projected/96b8dcae-2676-4c66-9902-ea0976801d41-kube-api-access-2wf2h\") pod \"cert-manager-cainjector-5545bd876-8m92h\" (UID: \"96b8dcae-2676-4c66-9902-ea0976801d41\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8m92h" Mar 18 18:19:27 crc kubenswrapper[5008]: I0318 18:19:27.654076 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96b8dcae-2676-4c66-9902-ea0976801d41-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8m92h\" (UID: \"96b8dcae-2676-4c66-9902-ea0976801d41\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8m92h" Mar 18 18:19:27 crc kubenswrapper[5008]: I0318 18:19:27.674749 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96b8dcae-2676-4c66-9902-ea0976801d41-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8m92h\" (UID: \"96b8dcae-2676-4c66-9902-ea0976801d41\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8m92h" Mar 18 18:19:27 crc kubenswrapper[5008]: I0318 18:19:27.685266 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wf2h\" (UniqueName: \"kubernetes.io/projected/96b8dcae-2676-4c66-9902-ea0976801d41-kube-api-access-2wf2h\") pod \"cert-manager-cainjector-5545bd876-8m92h\" (UID: \"96b8dcae-2676-4c66-9902-ea0976801d41\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8m92h" Mar 18 18:19:27 crc kubenswrapper[5008]: I0318 18:19:27.975022 5008 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-8m92h" Mar 18 18:19:28 crc kubenswrapper[5008]: I0318 18:19:28.437747 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8m92h"] Mar 18 18:19:28 crc kubenswrapper[5008]: W0318 18:19:28.443907 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96b8dcae_2676_4c66_9902_ea0976801d41.slice/crio-e8eafc1ac6540d9434a7fc8ca5e9756dc39d1038accdb85ae71aaf176e9fd136 WatchSource:0}: Error finding container e8eafc1ac6540d9434a7fc8ca5e9756dc39d1038accdb85ae71aaf176e9fd136: Status 404 returned error can't find the container with id e8eafc1ac6540d9434a7fc8ca5e9756dc39d1038accdb85ae71aaf176e9fd136 Mar 18 18:19:29 crc kubenswrapper[5008]: I0318 18:19:29.076973 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-8m92h" event={"ID":"96b8dcae-2676-4c66-9902-ea0976801d41","Type":"ContainerStarted","Data":"e8eafc1ac6540d9434a7fc8ca5e9756dc39d1038accdb85ae71aaf176e9fd136"} Mar 18 18:19:31 crc kubenswrapper[5008]: I0318 18:19:31.090011 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-8m92h" event={"ID":"96b8dcae-2676-4c66-9902-ea0976801d41","Type":"ContainerStarted","Data":"47d07fe6dcd75998d9fc4311fd3ccbe95dd14516ef17e5eb6691a94fa90678a6"} Mar 18 18:19:31 crc kubenswrapper[5008]: I0318 18:19:31.092486 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-2wx9n" event={"ID":"edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb","Type":"ContainerStarted","Data":"a83b9110aee8fca0eb844296c70d82a5f3dbe02d79c6e7bc2ab7bd05279cae72"} Mar 18 18:19:31 crc kubenswrapper[5008]: I0318 18:19:31.092710 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="cert-manager/cert-manager-webhook-6888856db4-2wx9n" Mar 18 18:19:31 crc kubenswrapper[5008]: I0318 18:19:31.112257 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-8m92h" podStartSLOduration=1.923941938 podStartE2EDuration="4.112234847s" podCreationTimestamp="2026-03-18 18:19:27 +0000 UTC" firstStartedPulling="2026-03-18 18:19:28.449008223 +0000 UTC m=+1024.968481302" lastFinishedPulling="2026-03-18 18:19:30.637301132 +0000 UTC m=+1027.156774211" observedRunningTime="2026-03-18 18:19:31.106314249 +0000 UTC m=+1027.625787328" watchObservedRunningTime="2026-03-18 18:19:31.112234847 +0000 UTC m=+1027.631707926" Mar 18 18:19:31 crc kubenswrapper[5008]: I0318 18:19:31.139000 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-2wx9n" podStartSLOduration=1.809517328 podStartE2EDuration="6.138977419s" podCreationTimestamp="2026-03-18 18:19:25 +0000 UTC" firstStartedPulling="2026-03-18 18:19:26.338450476 +0000 UTC m=+1022.857923545" lastFinishedPulling="2026-03-18 18:19:30.667910557 +0000 UTC m=+1027.187383636" observedRunningTime="2026-03-18 18:19:31.133775331 +0000 UTC m=+1027.653248440" watchObservedRunningTime="2026-03-18 18:19:31.138977419 +0000 UTC m=+1027.658450518" Mar 18 18:19:33 crc kubenswrapper[5008]: I0318 18:19:33.863927 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-7zrmj"] Mar 18 18:19:33 crc kubenswrapper[5008]: I0318 18:19:33.865633 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-7zrmj" Mar 18 18:19:33 crc kubenswrapper[5008]: I0318 18:19:33.868285 5008 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-st2xj" Mar 18 18:19:33 crc kubenswrapper[5008]: I0318 18:19:33.885487 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-7zrmj"] Mar 18 18:19:33 crc kubenswrapper[5008]: I0318 18:19:33.938080 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xlpw\" (UniqueName: \"kubernetes.io/projected/dffaebea-0338-4742-ab2a-801b071ae679-kube-api-access-5xlpw\") pod \"cert-manager-545d4d4674-7zrmj\" (UID: \"dffaebea-0338-4742-ab2a-801b071ae679\") " pod="cert-manager/cert-manager-545d4d4674-7zrmj" Mar 18 18:19:33 crc kubenswrapper[5008]: I0318 18:19:33.938337 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dffaebea-0338-4742-ab2a-801b071ae679-bound-sa-token\") pod \"cert-manager-545d4d4674-7zrmj\" (UID: \"dffaebea-0338-4742-ab2a-801b071ae679\") " pod="cert-manager/cert-manager-545d4d4674-7zrmj" Mar 18 18:19:34 crc kubenswrapper[5008]: I0318 18:19:34.039318 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xlpw\" (UniqueName: \"kubernetes.io/projected/dffaebea-0338-4742-ab2a-801b071ae679-kube-api-access-5xlpw\") pod \"cert-manager-545d4d4674-7zrmj\" (UID: \"dffaebea-0338-4742-ab2a-801b071ae679\") " pod="cert-manager/cert-manager-545d4d4674-7zrmj" Mar 18 18:19:34 crc kubenswrapper[5008]: I0318 18:19:34.039360 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dffaebea-0338-4742-ab2a-801b071ae679-bound-sa-token\") pod \"cert-manager-545d4d4674-7zrmj\" (UID: 
\"dffaebea-0338-4742-ab2a-801b071ae679\") " pod="cert-manager/cert-manager-545d4d4674-7zrmj" Mar 18 18:19:34 crc kubenswrapper[5008]: I0318 18:19:34.057168 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xlpw\" (UniqueName: \"kubernetes.io/projected/dffaebea-0338-4742-ab2a-801b071ae679-kube-api-access-5xlpw\") pod \"cert-manager-545d4d4674-7zrmj\" (UID: \"dffaebea-0338-4742-ab2a-801b071ae679\") " pod="cert-manager/cert-manager-545d4d4674-7zrmj" Mar 18 18:19:34 crc kubenswrapper[5008]: I0318 18:19:34.058952 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dffaebea-0338-4742-ab2a-801b071ae679-bound-sa-token\") pod \"cert-manager-545d4d4674-7zrmj\" (UID: \"dffaebea-0338-4742-ab2a-801b071ae679\") " pod="cert-manager/cert-manager-545d4d4674-7zrmj" Mar 18 18:19:34 crc kubenswrapper[5008]: I0318 18:19:34.184031 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-7zrmj" Mar 18 18:19:34 crc kubenswrapper[5008]: I0318 18:19:34.591149 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-7zrmj"] Mar 18 18:19:34 crc kubenswrapper[5008]: W0318 18:19:34.596020 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddffaebea_0338_4742_ab2a_801b071ae679.slice/crio-94a80b56d4199641e674dbdc221da0ce733cd57744851c8c8788a5926e780b39 WatchSource:0}: Error finding container 94a80b56d4199641e674dbdc221da0ce733cd57744851c8c8788a5926e780b39: Status 404 returned error can't find the container with id 94a80b56d4199641e674dbdc221da0ce733cd57744851c8c8788a5926e780b39 Mar 18 18:19:35 crc kubenswrapper[5008]: I0318 18:19:35.131162 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-7zrmj" 
event={"ID":"dffaebea-0338-4742-ab2a-801b071ae679","Type":"ContainerStarted","Data":"94a80b56d4199641e674dbdc221da0ce733cd57744851c8c8788a5926e780b39"} Mar 18 18:19:35 crc kubenswrapper[5008]: I0318 18:19:35.845963 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-2wx9n" Mar 18 18:19:36 crc kubenswrapper[5008]: I0318 18:19:36.142972 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-7zrmj" event={"ID":"dffaebea-0338-4742-ab2a-801b071ae679","Type":"ContainerStarted","Data":"3c51aedc957731ac40dfb66dfefc0e71461493b4a5ecbeea7f4bd35b721145ff"} Mar 18 18:19:36 crc kubenswrapper[5008]: I0318 18:19:36.162661 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-7zrmj" podStartSLOduration=3.1626438869999998 podStartE2EDuration="3.162643887s" podCreationTimestamp="2026-03-18 18:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:19:36.159259426 +0000 UTC m=+1032.678732505" watchObservedRunningTime="2026-03-18 18:19:36.162643887 +0000 UTC m=+1032.682116966" Mar 18 18:19:39 crc kubenswrapper[5008]: I0318 18:19:39.067875 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hrzg6"] Mar 18 18:19:39 crc kubenswrapper[5008]: I0318 18:19:39.069127 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hrzg6" Mar 18 18:19:39 crc kubenswrapper[5008]: I0318 18:19:39.072908 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 18:19:39 crc kubenswrapper[5008]: I0318 18:19:39.073276 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 18:19:39 crc kubenswrapper[5008]: I0318 18:19:39.077018 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-cgxct" Mar 18 18:19:39 crc kubenswrapper[5008]: I0318 18:19:39.102293 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hrzg6"] Mar 18 18:19:39 crc kubenswrapper[5008]: I0318 18:19:39.125041 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7n7v\" (UniqueName: \"kubernetes.io/projected/45bb4ac1-daae-4412-8c26-38429a1f1182-kube-api-access-m7n7v\") pod \"openstack-operator-index-hrzg6\" (UID: \"45bb4ac1-daae-4412-8c26-38429a1f1182\") " pod="openstack-operators/openstack-operator-index-hrzg6" Mar 18 18:19:39 crc kubenswrapper[5008]: I0318 18:19:39.226322 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7n7v\" (UniqueName: \"kubernetes.io/projected/45bb4ac1-daae-4412-8c26-38429a1f1182-kube-api-access-m7n7v\") pod \"openstack-operator-index-hrzg6\" (UID: \"45bb4ac1-daae-4412-8c26-38429a1f1182\") " pod="openstack-operators/openstack-operator-index-hrzg6" Mar 18 18:19:39 crc kubenswrapper[5008]: I0318 18:19:39.248999 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7n7v\" (UniqueName: \"kubernetes.io/projected/45bb4ac1-daae-4412-8c26-38429a1f1182-kube-api-access-m7n7v\") pod \"openstack-operator-index-hrzg6\" (UID: 
\"45bb4ac1-daae-4412-8c26-38429a1f1182\") " pod="openstack-operators/openstack-operator-index-hrzg6" Mar 18 18:19:39 crc kubenswrapper[5008]: I0318 18:19:39.391702 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hrzg6" Mar 18 18:19:39 crc kubenswrapper[5008]: I0318 18:19:39.828660 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hrzg6"] Mar 18 18:19:40 crc kubenswrapper[5008]: I0318 18:19:40.165155 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hrzg6" event={"ID":"45bb4ac1-daae-4412-8c26-38429a1f1182","Type":"ContainerStarted","Data":"f0851e93851a39bcd957d95277b33f2f97fa4762097d668ed3cf2cab42c9ee8f"} Mar 18 18:19:41 crc kubenswrapper[5008]: I0318 18:19:41.174181 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hrzg6" event={"ID":"45bb4ac1-daae-4412-8c26-38429a1f1182","Type":"ContainerStarted","Data":"f2d5fe1716e5ce94dd6d32129408fd9260ede97b7700fadc57a580b17b307657"} Mar 18 18:19:41 crc kubenswrapper[5008]: I0318 18:19:41.197066 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hrzg6" podStartSLOduration=1.177601163 podStartE2EDuration="2.197040303s" podCreationTimestamp="2026-03-18 18:19:39 +0000 UTC" firstStartedPulling="2026-03-18 18:19:39.838324507 +0000 UTC m=+1036.357797606" lastFinishedPulling="2026-03-18 18:19:40.857763667 +0000 UTC m=+1037.377236746" observedRunningTime="2026-03-18 18:19:41.195859373 +0000 UTC m=+1037.715332472" watchObservedRunningTime="2026-03-18 18:19:41.197040303 +0000 UTC m=+1037.716513412" Mar 18 18:19:49 crc kubenswrapper[5008]: I0318 18:19:49.392442 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-hrzg6" Mar 18 18:19:49 crc kubenswrapper[5008]: 
I0318 18:19:49.392989 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-hrzg6" Mar 18 18:19:49 crc kubenswrapper[5008]: I0318 18:19:49.434872 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-hrzg6" Mar 18 18:19:50 crc kubenswrapper[5008]: I0318 18:19:50.281876 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-hrzg6" Mar 18 18:19:55 crc kubenswrapper[5008]: I0318 18:19:55.911445 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z"] Mar 18 18:19:55 crc kubenswrapper[5008]: I0318 18:19:55.914427 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" Mar 18 18:19:55 crc kubenswrapper[5008]: I0318 18:19:55.917691 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-v5bvj" Mar 18 18:19:55 crc kubenswrapper[5008]: I0318 18:19:55.929792 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z"] Mar 18 18:19:56 crc kubenswrapper[5008]: I0318 18:19:56.075440 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z\" (UID: \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" Mar 18 18:19:56 crc kubenswrapper[5008]: I0318 18:19:56.075794 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f4pqs\" (UniqueName: \"kubernetes.io/projected/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-kube-api-access-f4pqs\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z\" (UID: \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" Mar 18 18:19:56 crc kubenswrapper[5008]: I0318 18:19:56.075923 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z\" (UID: \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" Mar 18 18:19:56 crc kubenswrapper[5008]: I0318 18:19:56.178310 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z\" (UID: \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" Mar 18 18:19:56 crc kubenswrapper[5008]: I0318 18:19:56.177442 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z\" (UID: \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" Mar 18 18:19:56 crc kubenswrapper[5008]: I0318 18:19:56.178516 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4pqs\" (UniqueName: \"kubernetes.io/projected/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-kube-api-access-f4pqs\") pod 
\"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z\" (UID: \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" Mar 18 18:19:56 crc kubenswrapper[5008]: I0318 18:19:56.179117 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z\" (UID: \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" Mar 18 18:19:56 crc kubenswrapper[5008]: I0318 18:19:56.179675 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z\" (UID: \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" Mar 18 18:19:56 crc kubenswrapper[5008]: I0318 18:19:56.268362 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4pqs\" (UniqueName: \"kubernetes.io/projected/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-kube-api-access-f4pqs\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z\" (UID: \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" Mar 18 18:19:56 crc kubenswrapper[5008]: I0318 18:19:56.557419 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" Mar 18 18:19:57 crc kubenswrapper[5008]: I0318 18:19:57.083487 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z"] Mar 18 18:19:57 crc kubenswrapper[5008]: I0318 18:19:57.305605 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" event={"ID":"b12d4842-a3e7-43d9-b1d2-29e90b8aa247","Type":"ContainerStarted","Data":"befa4a5e68f3e196230ed66cd38dfe0db6b8efdeb3d3021546bc9747a8c984c9"} Mar 18 18:19:57 crc kubenswrapper[5008]: I0318 18:19:57.305666 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" event={"ID":"b12d4842-a3e7-43d9-b1d2-29e90b8aa247","Type":"ContainerStarted","Data":"4a9cab608abca79681eb8db376b82520a4650c50eeb66caf0f83c91a6974d857"} Mar 18 18:19:58 crc kubenswrapper[5008]: I0318 18:19:58.319870 5008 generic.go:334] "Generic (PLEG): container finished" podID="b12d4842-a3e7-43d9-b1d2-29e90b8aa247" containerID="befa4a5e68f3e196230ed66cd38dfe0db6b8efdeb3d3021546bc9747a8c984c9" exitCode=0 Mar 18 18:19:58 crc kubenswrapper[5008]: I0318 18:19:58.320355 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" event={"ID":"b12d4842-a3e7-43d9-b1d2-29e90b8aa247","Type":"ContainerDied","Data":"befa4a5e68f3e196230ed66cd38dfe0db6b8efdeb3d3021546bc9747a8c984c9"} Mar 18 18:19:59 crc kubenswrapper[5008]: I0318 18:19:59.344720 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" 
event={"ID":"b12d4842-a3e7-43d9-b1d2-29e90b8aa247","Type":"ContainerStarted","Data":"aa7d60fa3801bbf5c6448085df38e7cd3cf236c16b4b1d8afef85dfc383486e5"} Mar 18 18:20:00 crc kubenswrapper[5008]: I0318 18:20:00.140172 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564300-58hhx"] Mar 18 18:20:00 crc kubenswrapper[5008]: I0318 18:20:00.144223 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564300-58hhx" Mar 18 18:20:00 crc kubenswrapper[5008]: I0318 18:20:00.148180 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:20:00 crc kubenswrapper[5008]: I0318 18:20:00.149178 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:20:00 crc kubenswrapper[5008]: I0318 18:20:00.150652 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564300-58hhx"] Mar 18 18:20:00 crc kubenswrapper[5008]: I0318 18:20:00.153699 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:20:00 crc kubenswrapper[5008]: I0318 18:20:00.261961 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwbzb\" (UniqueName: \"kubernetes.io/projected/dd02a294-4a36-457a-9573-6cb6d07c841b-kube-api-access-zwbzb\") pod \"auto-csr-approver-29564300-58hhx\" (UID: \"dd02a294-4a36-457a-9573-6cb6d07c841b\") " pod="openshift-infra/auto-csr-approver-29564300-58hhx" Mar 18 18:20:00 crc kubenswrapper[5008]: I0318 18:20:00.356100 5008 generic.go:334] "Generic (PLEG): container finished" podID="b12d4842-a3e7-43d9-b1d2-29e90b8aa247" containerID="aa7d60fa3801bbf5c6448085df38e7cd3cf236c16b4b1d8afef85dfc383486e5" exitCode=0 Mar 18 18:20:00 crc kubenswrapper[5008]: I0318 18:20:00.356171 5008 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" event={"ID":"b12d4842-a3e7-43d9-b1d2-29e90b8aa247","Type":"ContainerDied","Data":"aa7d60fa3801bbf5c6448085df38e7cd3cf236c16b4b1d8afef85dfc383486e5"} Mar 18 18:20:00 crc kubenswrapper[5008]: I0318 18:20:00.363693 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwbzb\" (UniqueName: \"kubernetes.io/projected/dd02a294-4a36-457a-9573-6cb6d07c841b-kube-api-access-zwbzb\") pod \"auto-csr-approver-29564300-58hhx\" (UID: \"dd02a294-4a36-457a-9573-6cb6d07c841b\") " pod="openshift-infra/auto-csr-approver-29564300-58hhx" Mar 18 18:20:00 crc kubenswrapper[5008]: I0318 18:20:00.386284 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwbzb\" (UniqueName: \"kubernetes.io/projected/dd02a294-4a36-457a-9573-6cb6d07c841b-kube-api-access-zwbzb\") pod \"auto-csr-approver-29564300-58hhx\" (UID: \"dd02a294-4a36-457a-9573-6cb6d07c841b\") " pod="openshift-infra/auto-csr-approver-29564300-58hhx" Mar 18 18:20:00 crc kubenswrapper[5008]: I0318 18:20:00.475980 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564300-58hhx" Mar 18 18:20:00 crc kubenswrapper[5008]: I0318 18:20:00.930839 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564300-58hhx"] Mar 18 18:20:00 crc kubenswrapper[5008]: W0318 18:20:00.951358 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd02a294_4a36_457a_9573_6cb6d07c841b.slice/crio-adc54f357f9701bc7f6b65980b9c58026ce78944fd0801bd41c237e1483c0756 WatchSource:0}: Error finding container adc54f357f9701bc7f6b65980b9c58026ce78944fd0801bd41c237e1483c0756: Status 404 returned error can't find the container with id adc54f357f9701bc7f6b65980b9c58026ce78944fd0801bd41c237e1483c0756 Mar 18 18:20:01 crc kubenswrapper[5008]: I0318 18:20:01.363342 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564300-58hhx" event={"ID":"dd02a294-4a36-457a-9573-6cb6d07c841b","Type":"ContainerStarted","Data":"adc54f357f9701bc7f6b65980b9c58026ce78944fd0801bd41c237e1483c0756"} Mar 18 18:20:02 crc kubenswrapper[5008]: I0318 18:20:02.376018 5008 generic.go:334] "Generic (PLEG): container finished" podID="b12d4842-a3e7-43d9-b1d2-29e90b8aa247" containerID="3341f73f7d55f81c20e044407f59fe551841717c222791033fbd65a88fd1f33e" exitCode=0 Mar 18 18:20:02 crc kubenswrapper[5008]: I0318 18:20:02.376060 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" event={"ID":"b12d4842-a3e7-43d9-b1d2-29e90b8aa247","Type":"ContainerDied","Data":"3341f73f7d55f81c20e044407f59fe551841717c222791033fbd65a88fd1f33e"} Mar 18 18:20:03 crc kubenswrapper[5008]: I0318 18:20:03.385814 5008 generic.go:334] "Generic (PLEG): container finished" podID="dd02a294-4a36-457a-9573-6cb6d07c841b" containerID="7139634e0568e4bad331a4250ab028a69538c5ecaa474e7a149aec84cf48a5a5" exitCode=0 Mar 18 
18:20:03 crc kubenswrapper[5008]: I0318 18:20:03.385910 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564300-58hhx" event={"ID":"dd02a294-4a36-457a-9573-6cb6d07c841b","Type":"ContainerDied","Data":"7139634e0568e4bad331a4250ab028a69538c5ecaa474e7a149aec84cf48a5a5"} Mar 18 18:20:03 crc kubenswrapper[5008]: I0318 18:20:03.624335 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" Mar 18 18:20:03 crc kubenswrapper[5008]: I0318 18:20:03.815275 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-util\") pod \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\" (UID: \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\") " Mar 18 18:20:03 crc kubenswrapper[5008]: I0318 18:20:03.815863 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-bundle\") pod \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\" (UID: \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\") " Mar 18 18:20:03 crc kubenswrapper[5008]: I0318 18:20:03.816071 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4pqs\" (UniqueName: \"kubernetes.io/projected/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-kube-api-access-f4pqs\") pod \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\" (UID: \"b12d4842-a3e7-43d9-b1d2-29e90b8aa247\") " Mar 18 18:20:03 crc kubenswrapper[5008]: I0318 18:20:03.816363 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-bundle" (OuterVolumeSpecName: "bundle") pod "b12d4842-a3e7-43d9-b1d2-29e90b8aa247" (UID: "b12d4842-a3e7-43d9-b1d2-29e90b8aa247"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:20:03 crc kubenswrapper[5008]: I0318 18:20:03.816707 5008 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:03 crc kubenswrapper[5008]: I0318 18:20:03.830595 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-kube-api-access-f4pqs" (OuterVolumeSpecName: "kube-api-access-f4pqs") pod "b12d4842-a3e7-43d9-b1d2-29e90b8aa247" (UID: "b12d4842-a3e7-43d9-b1d2-29e90b8aa247"). InnerVolumeSpecName "kube-api-access-f4pqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:03 crc kubenswrapper[5008]: I0318 18:20:03.840519 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-util" (OuterVolumeSpecName: "util") pod "b12d4842-a3e7-43d9-b1d2-29e90b8aa247" (UID: "b12d4842-a3e7-43d9-b1d2-29e90b8aa247"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:20:03 crc kubenswrapper[5008]: I0318 18:20:03.917255 5008 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-util\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:03 crc kubenswrapper[5008]: I0318 18:20:03.917283 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4pqs\" (UniqueName: \"kubernetes.io/projected/b12d4842-a3e7-43d9-b1d2-29e90b8aa247-kube-api-access-f4pqs\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:05 crc kubenswrapper[5008]: I0318 18:20:05.091961 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" event={"ID":"b12d4842-a3e7-43d9-b1d2-29e90b8aa247","Type":"ContainerDied","Data":"4a9cab608abca79681eb8db376b82520a4650c50eeb66caf0f83c91a6974d857"} Mar 18 18:20:05 crc kubenswrapper[5008]: I0318 18:20:05.092441 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a9cab608abca79681eb8db376b82520a4650c50eeb66caf0f83c91a6974d857" Mar 18 18:20:05 crc kubenswrapper[5008]: I0318 18:20:05.092115 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z" Mar 18 18:20:05 crc kubenswrapper[5008]: I0318 18:20:05.403656 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564300-58hhx" Mar 18 18:20:05 crc kubenswrapper[5008]: I0318 18:20:05.482286 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwbzb\" (UniqueName: \"kubernetes.io/projected/dd02a294-4a36-457a-9573-6cb6d07c841b-kube-api-access-zwbzb\") pod \"dd02a294-4a36-457a-9573-6cb6d07c841b\" (UID: \"dd02a294-4a36-457a-9573-6cb6d07c841b\") " Mar 18 18:20:05 crc kubenswrapper[5008]: I0318 18:20:05.486603 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd02a294-4a36-457a-9573-6cb6d07c841b-kube-api-access-zwbzb" (OuterVolumeSpecName: "kube-api-access-zwbzb") pod "dd02a294-4a36-457a-9573-6cb6d07c841b" (UID: "dd02a294-4a36-457a-9573-6cb6d07c841b"). InnerVolumeSpecName "kube-api-access-zwbzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:20:05 crc kubenswrapper[5008]: I0318 18:20:05.584282 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwbzb\" (UniqueName: \"kubernetes.io/projected/dd02a294-4a36-457a-9573-6cb6d07c841b-kube-api-access-zwbzb\") on node \"crc\" DevicePath \"\"" Mar 18 18:20:06 crc kubenswrapper[5008]: I0318 18:20:06.101315 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564300-58hhx" event={"ID":"dd02a294-4a36-457a-9573-6cb6d07c841b","Type":"ContainerDied","Data":"adc54f357f9701bc7f6b65980b9c58026ce78944fd0801bd41c237e1483c0756"} Mar 18 18:20:06 crc kubenswrapper[5008]: I0318 18:20:06.101379 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adc54f357f9701bc7f6b65980b9c58026ce78944fd0801bd41c237e1483c0756" Mar 18 18:20:06 crc kubenswrapper[5008]: I0318 18:20:06.101471 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564300-58hhx" Mar 18 18:20:06 crc kubenswrapper[5008]: I0318 18:20:06.478982 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564294-g5m8z"] Mar 18 18:20:06 crc kubenswrapper[5008]: I0318 18:20:06.491914 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564294-g5m8z"] Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.075742 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-mrt6s"] Mar 18 18:20:08 crc kubenswrapper[5008]: E0318 18:20:08.076258 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd02a294-4a36-457a-9573-6cb6d07c841b" containerName="oc" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.076270 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd02a294-4a36-457a-9573-6cb6d07c841b" containerName="oc" Mar 18 18:20:08 crc kubenswrapper[5008]: E0318 18:20:08.076285 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12d4842-a3e7-43d9-b1d2-29e90b8aa247" containerName="util" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.076290 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12d4842-a3e7-43d9-b1d2-29e90b8aa247" containerName="util" Mar 18 18:20:08 crc kubenswrapper[5008]: E0318 18:20:08.076303 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12d4842-a3e7-43d9-b1d2-29e90b8aa247" containerName="extract" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.076309 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12d4842-a3e7-43d9-b1d2-29e90b8aa247" containerName="extract" Mar 18 18:20:08 crc kubenswrapper[5008]: E0318 18:20:08.076317 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12d4842-a3e7-43d9-b1d2-29e90b8aa247" containerName="pull" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 
18:20:08.076323 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12d4842-a3e7-43d9-b1d2-29e90b8aa247" containerName="pull" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.076426 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd02a294-4a36-457a-9573-6cb6d07c841b" containerName="oc" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.076440 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12d4842-a3e7-43d9-b1d2-29e90b8aa247" containerName="extract" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.076830 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-mrt6s" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.078759 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-92n4b" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.167782 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-mrt6s"] Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.206869 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="424dea02-32aa-4bb7-913c-dc9eec1a265b" path="/var/lib/kubelet/pods/424dea02-32aa-4bb7-913c-dc9eec1a265b/volumes" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.218903 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x7sd\" (UniqueName: \"kubernetes.io/projected/f80b85c8-6f72-4dc2-bbfa-b03c0371597a-kube-api-access-7x7sd\") pod \"openstack-operator-controller-init-b85c4d696-mrt6s\" (UID: \"f80b85c8-6f72-4dc2-bbfa-b03c0371597a\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-mrt6s" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.319902 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-7x7sd\" (UniqueName: \"kubernetes.io/projected/f80b85c8-6f72-4dc2-bbfa-b03c0371597a-kube-api-access-7x7sd\") pod \"openstack-operator-controller-init-b85c4d696-mrt6s\" (UID: \"f80b85c8-6f72-4dc2-bbfa-b03c0371597a\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-mrt6s" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.346821 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x7sd\" (UniqueName: \"kubernetes.io/projected/f80b85c8-6f72-4dc2-bbfa-b03c0371597a-kube-api-access-7x7sd\") pod \"openstack-operator-controller-init-b85c4d696-mrt6s\" (UID: \"f80b85c8-6f72-4dc2-bbfa-b03c0371597a\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-mrt6s" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.392649 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-mrt6s" Mar 18 18:20:08 crc kubenswrapper[5008]: I0318 18:20:08.646505 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-mrt6s"] Mar 18 18:20:09 crc kubenswrapper[5008]: I0318 18:20:09.124529 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-mrt6s" event={"ID":"f80b85c8-6f72-4dc2-bbfa-b03c0371597a","Type":"ContainerStarted","Data":"5d44ed31e4e951fea62919c591b69b2d74a88923bed95e6f14e3019d77efdb5c"} Mar 18 18:20:14 crc kubenswrapper[5008]: I0318 18:20:14.173964 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-mrt6s" event={"ID":"f80b85c8-6f72-4dc2-bbfa-b03c0371597a","Type":"ContainerStarted","Data":"d58cbee8b74a580a60a9251757fbdba59bc45779c7bd99b8d773ebd8ebb1bd62"} Mar 18 18:20:14 crc kubenswrapper[5008]: I0318 18:20:14.174380 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-mrt6s" Mar 18 18:20:14 crc kubenswrapper[5008]: I0318 18:20:14.210973 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-mrt6s" podStartSLOduration=1.117664393 podStartE2EDuration="6.210952783s" podCreationTimestamp="2026-03-18 18:20:08 +0000 UTC" firstStartedPulling="2026-03-18 18:20:08.654105288 +0000 UTC m=+1065.173578367" lastFinishedPulling="2026-03-18 18:20:13.747393678 +0000 UTC m=+1070.266866757" observedRunningTime="2026-03-18 18:20:14.207710872 +0000 UTC m=+1070.727183991" watchObservedRunningTime="2026-03-18 18:20:14.210952783 +0000 UTC m=+1070.730425882" Mar 18 18:20:18 crc kubenswrapper[5008]: I0318 18:20:18.397718 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-mrt6s" Mar 18 18:20:36 crc kubenswrapper[5008]: I0318 18:20:36.618765 5008 scope.go:117] "RemoveContainer" containerID="9fc3f1598faae046bac8c5b726f1b351063210ed7b800195b88579c2e1a1bc64" Mar 18 18:20:37 crc kubenswrapper[5008]: I0318 18:20:37.952892 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-cgg7x"] Mar 18 18:20:37 crc kubenswrapper[5008]: I0318 18:20:37.955507 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-cgg7x" Mar 18 18:20:37 crc kubenswrapper[5008]: I0318 18:20:37.965182 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-cgg7x"] Mar 18 18:20:37 crc kubenswrapper[5008]: I0318 18:20:37.974920 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ckgvg" Mar 18 18:20:37 crc kubenswrapper[5008]: I0318 18:20:37.995062 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-wn4dr"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:37.997060 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-wn4dr" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.002501 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-fbwjk"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.003850 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-fbwjk" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.021516 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-thfzw"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.022764 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-fbwjk"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.022781 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-wn4dr"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.022865 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-thfzw" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.033177 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-24ddb" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.033465 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-q9r4v" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.033869 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-qf7nf" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.059312 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-thfzw"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.070702 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-x9hn5"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.071628 5008 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-x9hn5" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.079754 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-22kfh" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.095142 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxc7p\" (UniqueName: \"kubernetes.io/projected/0c6afee9-7e01-426d-beb5-7db66667228e-kube-api-access-rxc7p\") pod \"glance-operator-controller-manager-79df6bcc97-thfzw\" (UID: \"0c6afee9-7e01-426d-beb5-7db66667228e\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-thfzw" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.095201 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnxnp\" (UniqueName: \"kubernetes.io/projected/d0a49d36-fd45-4ff2-9bb9-f1ccfb048537-kube-api-access-hnxnp\") pod \"designate-operator-controller-manager-588d4d986b-wn4dr\" (UID: \"d0a49d36-fd45-4ff2-9bb9-f1ccfb048537\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-wn4dr" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.095253 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvcct\" (UniqueName: \"kubernetes.io/projected/c60b7b7f-31b3-49da-b7a3-9559345c180b-kube-api-access-nvcct\") pod \"cinder-operator-controller-manager-8d58dc466-fbwjk\" (UID: \"c60b7b7f-31b3-49da-b7a3-9559345c180b\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-fbwjk" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.095287 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w7sg\" (UniqueName: 
\"kubernetes.io/projected/852f1249-96d5-4768-b4a5-cba6a81a00a0-kube-api-access-6w7sg\") pod \"barbican-operator-controller-manager-59bc569d95-cgg7x\" (UID: \"852f1249-96d5-4768-b4a5-cba6a81a00a0\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-cgg7x" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.108240 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-x9hn5"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.118662 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-stnqj"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.119632 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-stnqj" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.124938 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-hbqd6" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.136343 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-stnqj"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.141190 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.142053 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.152952 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nggnp" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.153151 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.161671 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-4ctxd"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.162415 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4ctxd" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.165644 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-bpf2j" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.176628 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-7fvgk"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.177835 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-7fvgk" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.179530 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-bfl6c" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.190216 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.195919 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvcct\" (UniqueName: \"kubernetes.io/projected/c60b7b7f-31b3-49da-b7a3-9559345c180b-kube-api-access-nvcct\") pod \"cinder-operator-controller-manager-8d58dc466-fbwjk\" (UID: \"c60b7b7f-31b3-49da-b7a3-9559345c180b\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-fbwjk" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.196145 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76n68\" (UniqueName: \"kubernetes.io/projected/3c744c82-fc71-4e82-8d8c-4f43404a7664-kube-api-access-76n68\") pod \"heat-operator-controller-manager-67dd5f86f5-x9hn5\" (UID: \"3c744c82-fc71-4e82-8d8c-4f43404a7664\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-x9hn5" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.196255 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w7sg\" (UniqueName: \"kubernetes.io/projected/852f1249-96d5-4768-b4a5-cba6a81a00a0-kube-api-access-6w7sg\") pod \"barbican-operator-controller-manager-59bc569d95-cgg7x\" (UID: \"852f1249-96d5-4768-b4a5-cba6a81a00a0\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-cgg7x" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.196387 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rxc7p\" (UniqueName: \"kubernetes.io/projected/0c6afee9-7e01-426d-beb5-7db66667228e-kube-api-access-rxc7p\") pod \"glance-operator-controller-manager-79df6bcc97-thfzw\" (UID: \"0c6afee9-7e01-426d-beb5-7db66667228e\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-thfzw" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.196498 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnxnp\" (UniqueName: \"kubernetes.io/projected/d0a49d36-fd45-4ff2-9bb9-f1ccfb048537-kube-api-access-hnxnp\") pod \"designate-operator-controller-manager-588d4d986b-wn4dr\" (UID: \"d0a49d36-fd45-4ff2-9bb9-f1ccfb048537\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-wn4dr" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.196615 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klgtt\" (UniqueName: \"kubernetes.io/projected/185ad843-7fab-4ae9-9e83-91c681a93f90-kube-api-access-klgtt\") pod \"horizon-operator-controller-manager-8464cc45fb-stnqj\" (UID: \"185ad843-7fab-4ae9-9e83-91c681a93f90\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-stnqj" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.226135 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-4ctxd"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.226175 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-7fvgk"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.230472 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnxnp\" (UniqueName: \"kubernetes.io/projected/d0a49d36-fd45-4ff2-9bb9-f1ccfb048537-kube-api-access-hnxnp\") pod 
\"designate-operator-controller-manager-588d4d986b-wn4dr\" (UID: \"d0a49d36-fd45-4ff2-9bb9-f1ccfb048537\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-wn4dr" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.235242 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvcct\" (UniqueName: \"kubernetes.io/projected/c60b7b7f-31b3-49da-b7a3-9559345c180b-kube-api-access-nvcct\") pod \"cinder-operator-controller-manager-8d58dc466-fbwjk\" (UID: \"c60b7b7f-31b3-49da-b7a3-9559345c180b\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-fbwjk" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.241767 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w7sg\" (UniqueName: \"kubernetes.io/projected/852f1249-96d5-4768-b4a5-cba6a81a00a0-kube-api-access-6w7sg\") pod \"barbican-operator-controller-manager-59bc569d95-cgg7x\" (UID: \"852f1249-96d5-4768-b4a5-cba6a81a00a0\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-cgg7x" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.244455 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-lcvlv"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.245442 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxc7p\" (UniqueName: \"kubernetes.io/projected/0c6afee9-7e01-426d-beb5-7db66667228e-kube-api-access-rxc7p\") pod \"glance-operator-controller-manager-79df6bcc97-thfzw\" (UID: \"0c6afee9-7e01-426d-beb5-7db66667228e\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-thfzw" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.245588 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-lcvlv" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.247907 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-thmph"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.248362 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-5259z" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.250488 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-thmph" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.253928 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-mf4zn" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.276831 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-cgg7x" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.289519 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-lcvlv"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.298672 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkwsv\" (UniqueName: \"kubernetes.io/projected/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-kube-api-access-hkwsv\") pod \"infra-operator-controller-manager-7b9c774f96-k9grx\" (UID: \"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.298727 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert\") pod \"infra-operator-controller-manager-7b9c774f96-k9grx\" (UID: \"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.298752 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm8p9\" (UniqueName: \"kubernetes.io/projected/53e77882-f2cb-4be8-a8ca-4e9118f30a95-kube-api-access-nm8p9\") pod \"ironic-operator-controller-manager-6f787dddc9-4ctxd\" (UID: \"53e77882-f2cb-4be8-a8ca-4e9118f30a95\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4ctxd" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.298828 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klgtt\" (UniqueName: \"kubernetes.io/projected/185ad843-7fab-4ae9-9e83-91c681a93f90-kube-api-access-klgtt\") pod 
\"horizon-operator-controller-manager-8464cc45fb-stnqj\" (UID: \"185ad843-7fab-4ae9-9e83-91c681a93f90\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-stnqj" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.298873 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76n68\" (UniqueName: \"kubernetes.io/projected/3c744c82-fc71-4e82-8d8c-4f43404a7664-kube-api-access-76n68\") pod \"heat-operator-controller-manager-67dd5f86f5-x9hn5\" (UID: \"3c744c82-fc71-4e82-8d8c-4f43404a7664\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-x9hn5" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.298917 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bhnm\" (UniqueName: \"kubernetes.io/projected/7cb2d5ac-6334-4e36-9c90-8554ccd85c4f-kube-api-access-5bhnm\") pod \"keystone-operator-controller-manager-768b96df4c-7fvgk\" (UID: \"7cb2d5ac-6334-4e36-9c90-8554ccd85c4f\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-7fvgk" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.328444 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klgtt\" (UniqueName: \"kubernetes.io/projected/185ad843-7fab-4ae9-9e83-91c681a93f90-kube-api-access-klgtt\") pod \"horizon-operator-controller-manager-8464cc45fb-stnqj\" (UID: \"185ad843-7fab-4ae9-9e83-91c681a93f90\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-stnqj" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.347971 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-6pf8x"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.360503 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-wn4dr" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.361088 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-fbwjk" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.370472 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-thfzw" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.371301 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76n68\" (UniqueName: \"kubernetes.io/projected/3c744c82-fc71-4e82-8d8c-4f43404a7664-kube-api-access-76n68\") pod \"heat-operator-controller-manager-67dd5f86f5-x9hn5\" (UID: \"3c744c82-fc71-4e82-8d8c-4f43404a7664\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-x9hn5" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.387059 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-6pf8x" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.387580 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-9w9g2"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.388302 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9w9g2" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.389322 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-x9hn5" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.389344 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-lhml9" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.391027 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-kcmzg" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.402632 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psx56\" (UniqueName: \"kubernetes.io/projected/fdaac0ac-492e-438a-b737-55c88fcf77f1-kube-api-access-psx56\") pod \"mariadb-operator-controller-manager-67ccfc9778-thmph\" (UID: \"fdaac0ac-492e-438a-b737-55c88fcf77f1\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-thmph" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.402706 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bhnm\" (UniqueName: \"kubernetes.io/projected/7cb2d5ac-6334-4e36-9c90-8554ccd85c4f-kube-api-access-5bhnm\") pod \"keystone-operator-controller-manager-768b96df4c-7fvgk\" (UID: \"7cb2d5ac-6334-4e36-9c90-8554ccd85c4f\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-7fvgk" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.402737 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkwsv\" (UniqueName: \"kubernetes.io/projected/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-kube-api-access-hkwsv\") pod \"infra-operator-controller-manager-7b9c774f96-k9grx\" (UID: \"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.402764 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert\") pod \"infra-operator-controller-manager-7b9c774f96-k9grx\" (UID: \"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.402779 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm8p9\" (UniqueName: \"kubernetes.io/projected/53e77882-f2cb-4be8-a8ca-4e9118f30a95-kube-api-access-nm8p9\") pod \"ironic-operator-controller-manager-6f787dddc9-4ctxd\" (UID: \"53e77882-f2cb-4be8-a8ca-4e9118f30a95\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4ctxd" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.402800 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj8bk\" (UniqueName: \"kubernetes.io/projected/17f04529-6a55-45ae-a999-888e15d5e9d8-kube-api-access-qj8bk\") pod \"manila-operator-controller-manager-55f864c847-lcvlv\" (UID: \"17f04529-6a55-45ae-a999-888e15d5e9d8\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-lcvlv" Mar 18 18:20:38 crc kubenswrapper[5008]: E0318 18:20:38.403201 5008 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 18:20:38 crc kubenswrapper[5008]: E0318 18:20:38.403238 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert podName:b78718c8-6ad4-4bc1-ae6d-26bcfbddf493 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:38.903224364 +0000 UTC m=+1095.422697443 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert") pod "infra-operator-controller-manager-7b9c774f96-k9grx" (UID: "b78718c8-6ad4-4bc1-ae6d-26bcfbddf493") : secret "infra-operator-webhook-server-cert" not found Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.422427 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-9w9g2"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.448405 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-stnqj" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.453530 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkwsv\" (UniqueName: \"kubernetes.io/projected/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-kube-api-access-hkwsv\") pod \"infra-operator-controller-manager-7b9c774f96-k9grx\" (UID: \"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.460937 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-6pf8x"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.462038 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm8p9\" (UniqueName: \"kubernetes.io/projected/53e77882-f2cb-4be8-a8ca-4e9118f30a95-kube-api-access-nm8p9\") pod \"ironic-operator-controller-manager-6f787dddc9-4ctxd\" (UID: \"53e77882-f2cb-4be8-a8ca-4e9118f30a95\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4ctxd" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.480142 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4ctxd" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.482891 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bhnm\" (UniqueName: \"kubernetes.io/projected/7cb2d5ac-6334-4e36-9c90-8554ccd85c4f-kube-api-access-5bhnm\") pod \"keystone-operator-controller-manager-768b96df4c-7fvgk\" (UID: \"7cb2d5ac-6334-4e36-9c90-8554ccd85c4f\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-7fvgk" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.484083 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-thmph"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.499438 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-7fvgk" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.511208 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk6xn\" (UniqueName: \"kubernetes.io/projected/91590da1-e8d8-47d0-8737-83b39da9214f-kube-api-access-nk6xn\") pod \"nova-operator-controller-manager-5d488d59fb-9w9g2\" (UID: \"91590da1-e8d8-47d0-8737-83b39da9214f\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9w9g2" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.511273 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj8bk\" (UniqueName: \"kubernetes.io/projected/17f04529-6a55-45ae-a999-888e15d5e9d8-kube-api-access-qj8bk\") pod \"manila-operator-controller-manager-55f864c847-lcvlv\" (UID: \"17f04529-6a55-45ae-a999-888e15d5e9d8\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-lcvlv" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.511313 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psx56\" (UniqueName: \"kubernetes.io/projected/fdaac0ac-492e-438a-b737-55c88fcf77f1-kube-api-access-psx56\") pod \"mariadb-operator-controller-manager-67ccfc9778-thmph\" (UID: \"fdaac0ac-492e-438a-b737-55c88fcf77f1\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-thmph" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.511341 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2gpr\" (UniqueName: \"kubernetes.io/projected/771f6d56-9294-45d0-bf92-5b59be4313bf-kube-api-access-w2gpr\") pod \"neutron-operator-controller-manager-767865f676-6pf8x\" (UID: \"771f6d56-9294-45d0-bf92-5b59be4313bf\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-6pf8x" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.533049 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj8bk\" (UniqueName: \"kubernetes.io/projected/17f04529-6a55-45ae-a999-888e15d5e9d8-kube-api-access-qj8bk\") pod \"manila-operator-controller-manager-55f864c847-lcvlv\" (UID: \"17f04529-6a55-45ae-a999-888e15d5e9d8\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-lcvlv" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.548321 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psx56\" (UniqueName: \"kubernetes.io/projected/fdaac0ac-492e-438a-b737-55c88fcf77f1-kube-api-access-psx56\") pod \"mariadb-operator-controller-manager-67ccfc9778-thmph\" (UID: \"fdaac0ac-492e-438a-b737-55c88fcf77f1\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-thmph" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.562259 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn"] Mar 18 18:20:38 crc 
kubenswrapper[5008]: I0318 18:20:38.566039 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.578332 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-5f4bl" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.610242 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.612434 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6xn\" (UniqueName: \"kubernetes.io/projected/91590da1-e8d8-47d0-8737-83b39da9214f-kube-api-access-nk6xn\") pod \"nova-operator-controller-manager-5d488d59fb-9w9g2\" (UID: \"91590da1-e8d8-47d0-8737-83b39da9214f\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9w9g2" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.612532 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2gpr\" (UniqueName: \"kubernetes.io/projected/771f6d56-9294-45d0-bf92-5b59be4313bf-kube-api-access-w2gpr\") pod \"neutron-operator-controller-manager-767865f676-6pf8x\" (UID: \"771f6d56-9294-45d0-bf92-5b59be4313bf\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-6pf8x" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.634054 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2gpr\" (UniqueName: \"kubernetes.io/projected/771f6d56-9294-45d0-bf92-5b59be4313bf-kube-api-access-w2gpr\") pod \"neutron-operator-controller-manager-767865f676-6pf8x\" (UID: \"771f6d56-9294-45d0-bf92-5b59be4313bf\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-6pf8x" Mar 18 18:20:38 crc 
kubenswrapper[5008]: I0318 18:20:38.635155 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk6xn\" (UniqueName: \"kubernetes.io/projected/91590da1-e8d8-47d0-8737-83b39da9214f-kube-api-access-nk6xn\") pod \"nova-operator-controller-manager-5d488d59fb-9w9g2\" (UID: \"91590da1-e8d8-47d0-8737-83b39da9214f\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9w9g2" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.636491 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.637257 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.641034 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rgjmh" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.641166 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.676798 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-x5ghn"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.677843 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-x5ghn" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.678714 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-lcvlv" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.679477 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vpt4x" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.692410 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.717052 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np8rm\" (UniqueName: \"kubernetes.io/projected/d5e131ed-11af-4026-a4d2-7a25e42e38c9-kube-api-access-np8rm\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mb8mj\" (UID: \"d5e131ed-11af-4026-a4d2-7a25e42e38c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.717125 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mb8mj\" (UID: \"d5e131ed-11af-4026-a4d2-7a25e42e38c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.717150 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slkp9\" (UniqueName: \"kubernetes.io/projected/ccf8ba7b-3e8e-4b28-9705-26a812edfc07-kube-api-access-slkp9\") pod \"octavia-operator-controller-manager-5b9f45d989-k8vzn\" (UID: \"ccf8ba7b-3e8e-4b28-9705-26a812edfc07\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn" Mar 18 18:20:38 crc 
kubenswrapper[5008]: I0318 18:20:38.721540 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-x5ghn"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.725692 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-hg8kc"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.727343 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hg8kc" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.731049 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-5ppmv" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.733675 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-hg8kc"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.758629 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.766011 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.770780 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.771297 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-65p8p" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.789211 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.790005 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.791865 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-thmph" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.792820 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-2crtx" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.799490 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.810906 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.811010 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.815046 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-jqx5c" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.817467 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.818792 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lww2n\" (UniqueName: \"kubernetes.io/projected/847f8b5e-2233-4fb5-964b-2df8309401d6-kube-api-access-lww2n\") pod \"ovn-operator-controller-manager-884679f54-x5ghn\" (UID: \"847f8b5e-2233-4fb5-964b-2df8309401d6\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-x5ghn" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.818897 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np8rm\" (UniqueName: \"kubernetes.io/projected/d5e131ed-11af-4026-a4d2-7a25e42e38c9-kube-api-access-np8rm\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mb8mj\" (UID: \"d5e131ed-11af-4026-a4d2-7a25e42e38c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.818971 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lfg9\" (UniqueName: \"kubernetes.io/projected/5c650888-8d3a-4835-bdc0-2686f8881f62-kube-api-access-2lfg9\") pod \"placement-operator-controller-manager-5784578c99-hg8kc\" (UID: \"5c650888-8d3a-4835-bdc0-2686f8881f62\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-hg8kc" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 
18:20:38.819036 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mb8mj\" (UID: \"d5e131ed-11af-4026-a4d2-7a25e42e38c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.819089 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slkp9\" (UniqueName: \"kubernetes.io/projected/ccf8ba7b-3e8e-4b28-9705-26a812edfc07-kube-api-access-slkp9\") pod \"octavia-operator-controller-manager-5b9f45d989-k8vzn\" (UID: \"ccf8ba7b-3e8e-4b28-9705-26a812edfc07\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn" Mar 18 18:20:38 crc kubenswrapper[5008]: E0318 18:20:38.819730 5008 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:20:38 crc kubenswrapper[5008]: E0318 18:20:38.819805 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert podName:d5e131ed-11af-4026-a4d2-7a25e42e38c9 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:39.31978105 +0000 UTC m=+1095.839254129 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899mb8mj" (UID: "d5e131ed-11af-4026-a4d2-7a25e42e38c9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.831679 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2v4j6"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.833670 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2v4j6" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.838786 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-lqjs8" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.843341 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2v4j6"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.857279 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slkp9\" (UniqueName: \"kubernetes.io/projected/ccf8ba7b-3e8e-4b28-9705-26a812edfc07-kube-api-access-slkp9\") pod \"octavia-operator-controller-manager-5b9f45d989-k8vzn\" (UID: \"ccf8ba7b-3e8e-4b28-9705-26a812edfc07\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.864615 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np8rm\" (UniqueName: \"kubernetes.io/projected/d5e131ed-11af-4026-a4d2-7a25e42e38c9-kube-api-access-np8rm\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mb8mj\" (UID: \"d5e131ed-11af-4026-a4d2-7a25e42e38c9\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.869850 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-6pf8x" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.870254 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.871070 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.873258 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.873390 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wcmnq" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.873569 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.877055 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.894268 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-cgg7x"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.896697 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9w9g2" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.901993 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.906666 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.908549 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.910090 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-8rn8j" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.913427 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd"] Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.932984 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcpw6\" (UniqueName: \"kubernetes.io/projected/da6e231d-9d56-4c1c-a10e-b4e258a88c2a-kube-api-access-vcpw6\") pod \"test-operator-controller-manager-5c5cb9c4d7-g6vj6\" (UID: \"da6e231d-9d56-4c1c-a10e-b4e258a88c2a\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.933046 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnsc7\" (UniqueName: \"kubernetes.io/projected/3b9302e6-1019-49f3-a708-d8552045764e-kube-api-access-tnsc7\") pod \"swift-operator-controller-manager-c674c5965-rjdmt\" (UID: 
\"3b9302e6-1019-49f3-a708-d8552045764e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.933084 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lww2n\" (UniqueName: \"kubernetes.io/projected/847f8b5e-2233-4fb5-964b-2df8309401d6-kube-api-access-lww2n\") pod \"ovn-operator-controller-manager-884679f54-x5ghn\" (UID: \"847f8b5e-2233-4fb5-964b-2df8309401d6\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-x5ghn" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.933135 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert\") pod \"infra-operator-controller-manager-7b9c774f96-k9grx\" (UID: \"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.933175 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tfk9\" (UniqueName: \"kubernetes.io/projected/8d3f1117-f02f-406b-bae8-84e5e71212c1-kube-api-access-6tfk9\") pod \"telemetry-operator-controller-manager-d6b694c5-kqmf8\" (UID: \"8d3f1117-f02f-406b-bae8-84e5e71212c1\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.933198 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t86c5\" (UniqueName: \"kubernetes.io/projected/1048d1cb-c6cf-47dd-8896-8445f09e6d25-kube-api-access-t86c5\") pod \"watcher-operator-controller-manager-6c4d75f7f9-2v4j6\" (UID: \"1048d1cb-c6cf-47dd-8896-8445f09e6d25\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2v4j6" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 
18:20:38.933227 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lfg9\" (UniqueName: \"kubernetes.io/projected/5c650888-8d3a-4835-bdc0-2686f8881f62-kube-api-access-2lfg9\") pod \"placement-operator-controller-manager-5784578c99-hg8kc\" (UID: \"5c650888-8d3a-4835-bdc0-2686f8881f62\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-hg8kc" Mar 18 18:20:38 crc kubenswrapper[5008]: E0318 18:20:38.933892 5008 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 18:20:38 crc kubenswrapper[5008]: E0318 18:20:38.933940 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert podName:b78718c8-6ad4-4bc1-ae6d-26bcfbddf493 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:39.933924515 +0000 UTC m=+1096.453397594 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert") pod "infra-operator-controller-manager-7b9c774f96-k9grx" (UID: "b78718c8-6ad4-4bc1-ae6d-26bcfbddf493") : secret "infra-operator-webhook-server-cert" not found Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.966696 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lfg9\" (UniqueName: \"kubernetes.io/projected/5c650888-8d3a-4835-bdc0-2686f8881f62-kube-api-access-2lfg9\") pod \"placement-operator-controller-manager-5784578c99-hg8kc\" (UID: \"5c650888-8d3a-4835-bdc0-2686f8881f62\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-hg8kc" Mar 18 18:20:38 crc kubenswrapper[5008]: I0318 18:20:38.977643 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lww2n\" (UniqueName: 
\"kubernetes.io/projected/847f8b5e-2233-4fb5-964b-2df8309401d6-kube-api-access-lww2n\") pod \"ovn-operator-controller-manager-884679f54-x5ghn\" (UID: \"847f8b5e-2233-4fb5-964b-2df8309401d6\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-x5ghn" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.016443 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-wn4dr"] Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.017875 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-x5ghn" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.034883 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcpw6\" (UniqueName: \"kubernetes.io/projected/da6e231d-9d56-4c1c-a10e-b4e258a88c2a-kube-api-access-vcpw6\") pod \"test-operator-controller-manager-5c5cb9c4d7-g6vj6\" (UID: \"da6e231d-9d56-4c1c-a10e-b4e258a88c2a\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.035238 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnsc7\" (UniqueName: \"kubernetes.io/projected/3b9302e6-1019-49f3-a708-d8552045764e-kube-api-access-tnsc7\") pod \"swift-operator-controller-manager-c674c5965-rjdmt\" (UID: \"3b9302e6-1019-49f3-a708-d8552045764e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.035293 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtc9k\" (UniqueName: \"kubernetes.io/projected/280f2ba7-134f-472b-82f4-d3728bbe6d31-kube-api-access-vtc9k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nlftd\" (UID: \"280f2ba7-134f-472b-82f4-d3728bbe6d31\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.035317 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmbjw\" (UniqueName: \"kubernetes.io/projected/002df576-b6e1-4ffd-9eda-5751dcf89505-kube-api-access-rmbjw\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.035371 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t86c5\" (UniqueName: \"kubernetes.io/projected/1048d1cb-c6cf-47dd-8896-8445f09e6d25-kube-api-access-t86c5\") pod \"watcher-operator-controller-manager-6c4d75f7f9-2v4j6\" (UID: \"1048d1cb-c6cf-47dd-8896-8445f09e6d25\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2v4j6" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.035390 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tfk9\" (UniqueName: \"kubernetes.io/projected/8d3f1117-f02f-406b-bae8-84e5e71212c1-kube-api-access-6tfk9\") pod \"telemetry-operator-controller-manager-d6b694c5-kqmf8\" (UID: \"8d3f1117-f02f-406b-bae8-84e5e71212c1\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.035413 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 
18:20:39.035452 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.054667 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t86c5\" (UniqueName: \"kubernetes.io/projected/1048d1cb-c6cf-47dd-8896-8445f09e6d25-kube-api-access-t86c5\") pod \"watcher-operator-controller-manager-6c4d75f7f9-2v4j6\" (UID: \"1048d1cb-c6cf-47dd-8896-8445f09e6d25\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2v4j6" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.056191 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnsc7\" (UniqueName: \"kubernetes.io/projected/3b9302e6-1019-49f3-a708-d8552045764e-kube-api-access-tnsc7\") pod \"swift-operator-controller-manager-c674c5965-rjdmt\" (UID: \"3b9302e6-1019-49f3-a708-d8552045764e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.057111 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tfk9\" (UniqueName: \"kubernetes.io/projected/8d3f1117-f02f-406b-bae8-84e5e71212c1-kube-api-access-6tfk9\") pod \"telemetry-operator-controller-manager-d6b694c5-kqmf8\" (UID: \"8d3f1117-f02f-406b-bae8-84e5e71212c1\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.062033 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcpw6\" (UniqueName: 
\"kubernetes.io/projected/da6e231d-9d56-4c1c-a10e-b4e258a88c2a-kube-api-access-vcpw6\") pod \"test-operator-controller-manager-5c5cb9c4d7-g6vj6\" (UID: \"da6e231d-9d56-4c1c-a10e-b4e258a88c2a\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.075081 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hg8kc" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.092364 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.127456 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-stnqj"] Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.132861 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.134848 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.137791 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.137854 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.137915 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtc9k\" (UniqueName: \"kubernetes.io/projected/280f2ba7-134f-472b-82f4-d3728bbe6d31-kube-api-access-vtc9k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nlftd\" (UID: \"280f2ba7-134f-472b-82f4-d3728bbe6d31\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.137936 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmbjw\" (UniqueName: \"kubernetes.io/projected/002df576-b6e1-4ffd-9eda-5751dcf89505-kube-api-access-rmbjw\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.138430 5008 secret.go:188] 
Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.138821 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs podName:002df576-b6e1-4ffd-9eda-5751dcf89505 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:39.638803557 +0000 UTC m=+1096.158276646 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-twfsv" (UID: "002df576-b6e1-4ffd-9eda-5751dcf89505") : secret "webhook-server-cert" not found Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.138870 5008 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.138895 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs podName:002df576-b6e1-4ffd-9eda-5751dcf89505 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:39.638888689 +0000 UTC m=+1096.158361768 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-twfsv" (UID: "002df576-b6e1-4ffd-9eda-5751dcf89505") : secret "metrics-server-cert" not found Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.151095 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-fbwjk"] Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.163131 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmbjw\" (UniqueName: \"kubernetes.io/projected/002df576-b6e1-4ffd-9eda-5751dcf89505-kube-api-access-rmbjw\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.165347 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2v4j6" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.167043 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtc9k\" (UniqueName: \"kubernetes.io/projected/280f2ba7-134f-472b-82f4-d3728bbe6d31-kube-api-access-vtc9k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nlftd\" (UID: \"280f2ba7-134f-472b-82f4-d3728bbe6d31\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.228293 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-x9hn5"] Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.249313 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-thfzw"] Mar 18 18:20:39 crc kubenswrapper[5008]: W0318 18:20:39.249436 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e77882_f2cb_4be8_a8ca_4e9118f30a95.slice/crio-99fd098e052342b363da3b7932b48807b6870d8ad9bbf258c0ec738ccd0c6656 WatchSource:0}: Error finding container 99fd098e052342b363da3b7932b48807b6870d8ad9bbf258c0ec738ccd0c6656: Status 404 returned error can't find the container with id 99fd098e052342b363da3b7932b48807b6870d8ad9bbf258c0ec738ccd0c6656 Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.254210 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-4ctxd"] Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.301117 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.346839 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mb8mj\" (UID: \"d5e131ed-11af-4026-a4d2-7a25e42e38c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.347005 5008 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.347049 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert podName:d5e131ed-11af-4026-a4d2-7a25e42e38c9 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:40.347036214 +0000 UTC m=+1096.866509283 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899mb8mj" (UID: "d5e131ed-11af-4026-a4d2-7a25e42e38c9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.379698 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4ctxd" event={"ID":"53e77882-f2cb-4be8-a8ca-4e9118f30a95","Type":"ContainerStarted","Data":"99fd098e052342b363da3b7932b48807b6870d8ad9bbf258c0ec738ccd0c6656"} Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.380901 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-wn4dr" event={"ID":"d0a49d36-fd45-4ff2-9bb9-f1ccfb048537","Type":"ContainerStarted","Data":"7d3923b7303545d3e69f714d739a811c3371bebcd7be3c657c1ddb35b7da6914"} Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.383641 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-thfzw" event={"ID":"0c6afee9-7e01-426d-beb5-7db66667228e","Type":"ContainerStarted","Data":"c20865d22aeec763aec03363f1b4a140edc26cee36867551423e8064ccd6a1b7"} Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.425274 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-stnqj" event={"ID":"185ad843-7fab-4ae9-9e83-91c681a93f90","Type":"ContainerStarted","Data":"f51d8b2a850d2756ce09b3c05c05aaa6b5a46f0430200df4a19813062aa5243c"} Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.427314 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-x9hn5" 
event={"ID":"3c744c82-fc71-4e82-8d8c-4f43404a7664","Type":"ContainerStarted","Data":"748ece1a00398a1c3a8a64f64a5b520a994369ba4d18571420d9433d24a650a5"} Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.430529 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-fbwjk" event={"ID":"c60b7b7f-31b3-49da-b7a3-9559345c180b","Type":"ContainerStarted","Data":"ebc2d9a1242b3d48ef2e075ac7cd115e6aaa1c68d9fd001953e3e95f96d64e76"} Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.432404 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-cgg7x" event={"ID":"852f1249-96d5-4768-b4a5-cba6a81a00a0","Type":"ContainerStarted","Data":"a0cdc026497693362048ebfddf1aa1620185b254b9c12e25447287100ea4d91a"} Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.467789 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-lcvlv"] Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.474985 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-7fvgk"] Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.501951 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-hg8kc"] Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.511336 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-thmph"] Mar 18 18:20:39 crc kubenswrapper[5008]: W0318 18:20:39.520310 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91590da1_e8d8_47d0_8737_83b39da9214f.slice/crio-6cc8c4a8dd42db622e87a98ecb9ae5db02b60ecb1af4506c29ff034a5b890ea2 WatchSource:0}: Error finding container 
6cc8c4a8dd42db622e87a98ecb9ae5db02b60ecb1af4506c29ff034a5b890ea2: Status 404 returned error can't find the container with id 6cc8c4a8dd42db622e87a98ecb9ae5db02b60ecb1af4506c29ff034a5b890ea2 Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.526482 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-9w9g2"] Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.537993 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-6pf8x"] Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.542753 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn"] Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.547882 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-slkp9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-k8vzn_openstack-operators(ccf8ba7b-3e8e-4b28-9705-26a812edfc07): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.551287 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn" podUID="ccf8ba7b-3e8e-4b28-9705-26a812edfc07" Mar 18 18:20:39 crc 
kubenswrapper[5008]: I0318 18:20:39.619422 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-x5ghn"] Mar 18 18:20:39 crc kubenswrapper[5008]: W0318 18:20:39.621288 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod847f8b5e_2233_4fb5_964b_2df8309401d6.slice/crio-82d49af132e5b35480dd429d0e259d42fdaa080d07037fcf2de3d41e2bb5ed17 WatchSource:0}: Error finding container 82d49af132e5b35480dd429d0e259d42fdaa080d07037fcf2de3d41e2bb5ed17: Status 404 returned error can't find the container with id 82d49af132e5b35480dd429d0e259d42fdaa080d07037fcf2de3d41e2bb5ed17 Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.653840 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.653970 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.654037 5008 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.654088 5008 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 18:20:39 crc 
kubenswrapper[5008]: E0318 18:20:39.654142 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs podName:002df576-b6e1-4ffd-9eda-5751dcf89505 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:40.654105561 +0000 UTC m=+1097.173578640 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-twfsv" (UID: "002df576-b6e1-4ffd-9eda-5751dcf89505") : secret "metrics-server-cert" not found Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.654165 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs podName:002df576-b6e1-4ffd-9eda-5751dcf89505 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:40.654156252 +0000 UTC m=+1097.173629331 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-twfsv" (UID: "002df576-b6e1-4ffd-9eda-5751dcf89505") : secret "webhook-server-cert" not found Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.685645 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd"] Mar 18 18:20:39 crc kubenswrapper[5008]: W0318 18:20:39.692525 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280f2ba7_134f_472b_82f4_d3728bbe6d31.slice/crio-00bcbc6d761b1196ed3cd4a62690ce9e995e31565fd16480ebbad73f08a69072 WatchSource:0}: Error finding container 00bcbc6d761b1196ed3cd4a62690ce9e995e31565fd16480ebbad73f08a69072: Status 404 returned error can't find the container with id 00bcbc6d761b1196ed3cd4a62690ce9e995e31565fd16480ebbad73f08a69072 Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.694883 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: 
{{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vtc9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-nlftd_openstack-operators(280f2ba7-134f-472b-82f4-d3728bbe6d31): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.696046 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd" podUID="280f2ba7-134f-472b-82f4-d3728bbe6d31" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.733346 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8"] Mar 18 18:20:39 crc kubenswrapper[5008]: W0318 18:20:39.740445 5008 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d3f1117_f02f_406b_bae8_84e5e71212c1.slice/crio-f6be66bb816154c57e2d1273258fd799c3ec9870cdcfffd715aafc5406b3f737 WatchSource:0}: Error finding container f6be66bb816154c57e2d1273258fd799c3ec9870cdcfffd715aafc5406b3f737: Status 404 returned error can't find the container with id f6be66bb816154c57e2d1273258fd799c3ec9870cdcfffd715aafc5406b3f737 Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.743780 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6tfk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-kqmf8_openstack-operators(8d3f1117-f02f-406b-bae8-84e5e71212c1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.745249 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8" podUID="8d3f1117-f02f-406b-bae8-84e5e71212c1" Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.762182 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tnsc7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-rjdmt_openstack-operators(3b9302e6-1019-49f3-a708-d8552045764e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.763498 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt" podUID="3b9302e6-1019-49f3-a708-d8552045764e" Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.764015 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt"] Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.831883 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2v4j6"] Mar 18 18:20:39 crc kubenswrapper[5008]: I0318 18:20:39.836764 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6"] Mar 18 18:20:39 crc kubenswrapper[5008]: W0318 18:20:39.843629 5008 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1048d1cb_c6cf_47dd_8896_8445f09e6d25.slice/crio-d6583947fc0eb98132a787cbbe51a8c49a26012b083653c9d5af6d21312b7cc4 WatchSource:0}: Error finding container d6583947fc0eb98132a787cbbe51a8c49a26012b083653c9d5af6d21312b7cc4: Status 404 returned error can't find the container with id d6583947fc0eb98132a787cbbe51a8c49a26012b083653c9d5af6d21312b7cc4 Mar 18 18:20:39 crc kubenswrapper[5008]: W0318 18:20:39.847306 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda6e231d_9d56_4c1c_a10e_b4e258a88c2a.slice/crio-679405d46caf420821cc0fb4c68a045e14192700fa03c154129566c0de1cb8fe WatchSource:0}: Error finding container 679405d46caf420821cc0fb4c68a045e14192700fa03c154129566c0de1cb8fe: Status 404 returned error can't find the container with id 679405d46caf420821cc0fb4c68a045e14192700fa03c154129566c0de1cb8fe Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.852317 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vcpw6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-g6vj6_openstack-operators(da6e231d-9d56-4c1c-a10e-b4e258a88c2a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.854118 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6" podUID="da6e231d-9d56-4c1c-a10e-b4e258a88c2a" Mar 18 18:20:39 crc 
kubenswrapper[5008]: I0318 18:20:39.958485 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert\") pod \"infra-operator-controller-manager-7b9c774f96-k9grx\" (UID: \"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.958678 5008 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 18:20:39 crc kubenswrapper[5008]: E0318 18:20:39.958796 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert podName:b78718c8-6ad4-4bc1-ae6d-26bcfbddf493 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:41.958779527 +0000 UTC m=+1098.478252606 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert") pod "infra-operator-controller-manager-7b9c774f96-k9grx" (UID: "b78718c8-6ad4-4bc1-ae6d-26bcfbddf493") : secret "infra-operator-webhook-server-cert" not found Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.366401 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mb8mj\" (UID: \"d5e131ed-11af-4026-a4d2-7a25e42e38c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:20:40 crc kubenswrapper[5008]: E0318 18:20:40.366601 5008 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:20:40 crc kubenswrapper[5008]: 
E0318 18:20:40.366914 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert podName:d5e131ed-11af-4026-a4d2-7a25e42e38c9 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:42.366893401 +0000 UTC m=+1098.886366480 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899mb8mj" (UID: "d5e131ed-11af-4026-a4d2-7a25e42e38c9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.440545 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2v4j6" event={"ID":"1048d1cb-c6cf-47dd-8896-8445f09e6d25","Type":"ContainerStarted","Data":"d6583947fc0eb98132a787cbbe51a8c49a26012b083653c9d5af6d21312b7cc4"} Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.442351 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9w9g2" event={"ID":"91590da1-e8d8-47d0-8737-83b39da9214f","Type":"ContainerStarted","Data":"6cc8c4a8dd42db622e87a98ecb9ae5db02b60ecb1af4506c29ff034a5b890ea2"} Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.444688 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd" event={"ID":"280f2ba7-134f-472b-82f4-d3728bbe6d31","Type":"ContainerStarted","Data":"00bcbc6d761b1196ed3cd4a62690ce9e995e31565fd16480ebbad73f08a69072"} Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.446959 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn" 
event={"ID":"ccf8ba7b-3e8e-4b28-9705-26a812edfc07","Type":"ContainerStarted","Data":"51b205ffef0194a0d4d4d6d8e06279395870ded4eea10a561d46eebf6f18f399"} Mar 18 18:20:40 crc kubenswrapper[5008]: E0318 18:20:40.447120 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd" podUID="280f2ba7-134f-472b-82f4-d3728bbe6d31" Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.448198 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6" event={"ID":"da6e231d-9d56-4c1c-a10e-b4e258a88c2a","Type":"ContainerStarted","Data":"679405d46caf420821cc0fb4c68a045e14192700fa03c154129566c0de1cb8fe"} Mar 18 18:20:40 crc kubenswrapper[5008]: E0318 18:20:40.450135 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6" podUID="da6e231d-9d56-4c1c-a10e-b4e258a88c2a" Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.450868 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt" event={"ID":"3b9302e6-1019-49f3-a708-d8552045764e","Type":"ContainerStarted","Data":"b9ca5b5793a3e9fb7c7152ff46ba07fa1ea9eddf025fda26d711b6334b417aa7"} Mar 18 18:20:40 crc kubenswrapper[5008]: E0318 18:20:40.452320 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt" podUID="3b9302e6-1019-49f3-a708-d8552045764e" Mar 18 18:20:40 crc kubenswrapper[5008]: E0318 18:20:40.453989 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn" podUID="ccf8ba7b-3e8e-4b28-9705-26a812edfc07" Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.454286 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hg8kc" event={"ID":"5c650888-8d3a-4835-bdc0-2686f8881f62","Type":"ContainerStarted","Data":"e9c84f25b345ff837907a90d91391f86a151976700b579602c2e0117988da5e1"} Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.458139 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-lcvlv" event={"ID":"17f04529-6a55-45ae-a999-888e15d5e9d8","Type":"ContainerStarted","Data":"efef072db15097c70ba72661d8ecbb03fe44df0b9bb945bfe4ddf95bb93773d7"} Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.461347 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-x5ghn" event={"ID":"847f8b5e-2233-4fb5-964b-2df8309401d6","Type":"ContainerStarted","Data":"82d49af132e5b35480dd429d0e259d42fdaa080d07037fcf2de3d41e2bb5ed17"} Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.478149 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-6pf8x" 
event={"ID":"771f6d56-9294-45d0-bf92-5b59be4313bf","Type":"ContainerStarted","Data":"8775a57d0732605adf489feaa079d1a8b00e457aa229aaf435b5c24c1e7cfed6"} Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.480618 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8" event={"ID":"8d3f1117-f02f-406b-bae8-84e5e71212c1","Type":"ContainerStarted","Data":"f6be66bb816154c57e2d1273258fd799c3ec9870cdcfffd715aafc5406b3f737"} Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.482813 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-7fvgk" event={"ID":"7cb2d5ac-6334-4e36-9c90-8554ccd85c4f","Type":"ContainerStarted","Data":"f18b05a8f7678e1963c956585580980cf08020751e47dbfae8ad8bf0810f9665"} Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.483907 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-thmph" event={"ID":"fdaac0ac-492e-438a-b737-55c88fcf77f1","Type":"ContainerStarted","Data":"8e38d5fe7ed42cd4419720e8ed7a16aca5f8fb766d834cc9536a4a676ac2374f"} Mar 18 18:20:40 crc kubenswrapper[5008]: E0318 18:20:40.486470 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8" podUID="8d3f1117-f02f-406b-bae8-84e5e71212c1" Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.670321 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: 
\"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:40 crc kubenswrapper[5008]: I0318 18:20:40.670465 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:40 crc kubenswrapper[5008]: E0318 18:20:40.670632 5008 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 18:20:40 crc kubenswrapper[5008]: E0318 18:20:40.670675 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs podName:002df576-b6e1-4ffd-9eda-5751dcf89505 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:42.670662646 +0000 UTC m=+1099.190135725 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-twfsv" (UID: "002df576-b6e1-4ffd-9eda-5751dcf89505") : secret "webhook-server-cert" not found Mar 18 18:20:40 crc kubenswrapper[5008]: E0318 18:20:40.671741 5008 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 18:20:40 crc kubenswrapper[5008]: E0318 18:20:40.671786 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs podName:002df576-b6e1-4ffd-9eda-5751dcf89505 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:42.671775584 +0000 UTC m=+1099.191248663 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-twfsv" (UID: "002df576-b6e1-4ffd-9eda-5751dcf89505") : secret "metrics-server-cert" not found Mar 18 18:20:41 crc kubenswrapper[5008]: E0318 18:20:41.491757 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn" podUID="ccf8ba7b-3e8e-4b28-9705-26a812edfc07" Mar 18 18:20:41 crc kubenswrapper[5008]: E0318 18:20:41.491942 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt" podUID="3b9302e6-1019-49f3-a708-d8552045764e" Mar 18 18:20:41 crc kubenswrapper[5008]: E0318 18:20:41.492546 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6" podUID="da6e231d-9d56-4c1c-a10e-b4e258a88c2a" Mar 18 18:20:41 crc kubenswrapper[5008]: E0318 18:20:41.492617 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd" podUID="280f2ba7-134f-472b-82f4-d3728bbe6d31" Mar 18 18:20:41 crc kubenswrapper[5008]: E0318 18:20:41.492868 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8" podUID="8d3f1117-f02f-406b-bae8-84e5e71212c1" Mar 18 18:20:42 crc kubenswrapper[5008]: I0318 18:20:42.058427 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert\") pod \"infra-operator-controller-manager-7b9c774f96-k9grx\" (UID: \"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:20:42 crc kubenswrapper[5008]: E0318 18:20:42.058652 5008 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 18:20:42 crc kubenswrapper[5008]: E0318 18:20:42.058702 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert podName:b78718c8-6ad4-4bc1-ae6d-26bcfbddf493 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:46.058684835 +0000 UTC m=+1102.578157914 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert") pod "infra-operator-controller-manager-7b9c774f96-k9grx" (UID: "b78718c8-6ad4-4bc1-ae6d-26bcfbddf493") : secret "infra-operator-webhook-server-cert" not found Mar 18 18:20:42 crc kubenswrapper[5008]: I0318 18:20:42.393496 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mb8mj\" (UID: \"d5e131ed-11af-4026-a4d2-7a25e42e38c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:20:42 crc kubenswrapper[5008]: E0318 18:20:42.393727 5008 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:20:42 crc kubenswrapper[5008]: E0318 18:20:42.393798 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert podName:d5e131ed-11af-4026-a4d2-7a25e42e38c9 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:46.393778115 +0000 UTC m=+1102.913251204 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899mb8mj" (UID: "d5e131ed-11af-4026-a4d2-7a25e42e38c9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:20:42 crc kubenswrapper[5008]: I0318 18:20:42.782131 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:42 crc kubenswrapper[5008]: I0318 18:20:42.782491 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:42 crc kubenswrapper[5008]: E0318 18:20:42.782753 5008 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 18:20:42 crc kubenswrapper[5008]: E0318 18:20:42.782758 5008 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 18:20:42 crc kubenswrapper[5008]: E0318 18:20:42.782804 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs podName:002df576-b6e1-4ffd-9eda-5751dcf89505 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:46.782789789 +0000 UTC m=+1103.302262858 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-twfsv" (UID: "002df576-b6e1-4ffd-9eda-5751dcf89505") : secret "metrics-server-cert" not found Mar 18 18:20:42 crc kubenswrapper[5008]: E0318 18:20:42.782828 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs podName:002df576-b6e1-4ffd-9eda-5751dcf89505 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:46.78281176 +0000 UTC m=+1103.302284839 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-twfsv" (UID: "002df576-b6e1-4ffd-9eda-5751dcf89505") : secret "webhook-server-cert" not found Mar 18 18:20:46 crc kubenswrapper[5008]: I0318 18:20:46.131013 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert\") pod \"infra-operator-controller-manager-7b9c774f96-k9grx\" (UID: \"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:20:46 crc kubenswrapper[5008]: E0318 18:20:46.131248 5008 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 18:20:46 crc kubenswrapper[5008]: E0318 18:20:46.131343 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert podName:b78718c8-6ad4-4bc1-ae6d-26bcfbddf493 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:54.131319576 +0000 UTC m=+1110.650792665 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert") pod "infra-operator-controller-manager-7b9c774f96-k9grx" (UID: "b78718c8-6ad4-4bc1-ae6d-26bcfbddf493") : secret "infra-operator-webhook-server-cert" not found Mar 18 18:20:46 crc kubenswrapper[5008]: I0318 18:20:46.472156 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mb8mj\" (UID: \"d5e131ed-11af-4026-a4d2-7a25e42e38c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:20:46 crc kubenswrapper[5008]: E0318 18:20:46.472288 5008 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:20:46 crc kubenswrapper[5008]: E0318 18:20:46.472375 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert podName:d5e131ed-11af-4026-a4d2-7a25e42e38c9 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:54.472353966 +0000 UTC m=+1110.991827135 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899mb8mj" (UID: "d5e131ed-11af-4026-a4d2-7a25e42e38c9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 18:20:46 crc kubenswrapper[5008]: I0318 18:20:46.877142 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:46 crc kubenswrapper[5008]: I0318 18:20:46.877578 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:46 crc kubenswrapper[5008]: E0318 18:20:46.877370 5008 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 18:20:46 crc kubenswrapper[5008]: E0318 18:20:46.877800 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs podName:002df576-b6e1-4ffd-9eda-5751dcf89505 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:54.877781351 +0000 UTC m=+1111.397254430 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-twfsv" (UID: "002df576-b6e1-4ffd-9eda-5751dcf89505") : secret "webhook-server-cert" not found Mar 18 18:20:46 crc kubenswrapper[5008]: E0318 18:20:46.877744 5008 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 18:20:46 crc kubenswrapper[5008]: E0318 18:20:46.878031 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs podName:002df576-b6e1-4ffd-9eda-5751dcf89505 nodeName:}" failed. No retries permitted until 2026-03-18 18:20:54.877985526 +0000 UTC m=+1111.397458605 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-twfsv" (UID: "002df576-b6e1-4ffd-9eda-5751dcf89505") : secret "metrics-server-cert" not found Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.574707 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-lcvlv" event={"ID":"17f04529-6a55-45ae-a999-888e15d5e9d8","Type":"ContainerStarted","Data":"1ee1eec075b589e76c91e58bdf6250b1758d06a62843b143886bd4f046410dfc"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.575296 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-lcvlv" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.576710 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-x5ghn" 
event={"ID":"847f8b5e-2233-4fb5-964b-2df8309401d6","Type":"ContainerStarted","Data":"5c728610e50a81f3adbc0df50d2dc22d355bd9d7f7f0cff03232d08f394952d3"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.577268 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-x5ghn" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.578499 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-thfzw" event={"ID":"0c6afee9-7e01-426d-beb5-7db66667228e","Type":"ContainerStarted","Data":"0e694719a6344f825902a5dacf4e7cf1f8b7db496adf42f832006ee8287c5653"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.578873 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-thfzw" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.580331 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-cgg7x" event={"ID":"852f1249-96d5-4768-b4a5-cba6a81a00a0","Type":"ContainerStarted","Data":"5e42561546ae37ddff4abab13caebee404eaa75c014e8303d86ffcdaf8857905"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.580658 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-cgg7x" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.582443 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4ctxd" event={"ID":"53e77882-f2cb-4be8-a8ca-4e9118f30a95","Type":"ContainerStarted","Data":"f170f4464aa9f1ea5c16800bfc6ce4906479aa9254125b2737b817992f08bad8"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.582517 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4ctxd" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.584348 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9w9g2" event={"ID":"91590da1-e8d8-47d0-8737-83b39da9214f","Type":"ContainerStarted","Data":"149f97e73a67036bdcfb4a0148ffc66fe8164574127e65574e19b6c99cf29071"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.584418 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9w9g2" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.585882 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-7fvgk" event={"ID":"7cb2d5ac-6334-4e36-9c90-8554ccd85c4f","Type":"ContainerStarted","Data":"40e4178832b7a175a712e3be07251ba8db35f647a8d2e214bc5f0940c0e270bd"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.585993 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-7fvgk" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.587349 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-6pf8x" event={"ID":"771f6d56-9294-45d0-bf92-5b59be4313bf","Type":"ContainerStarted","Data":"0e7708d6a56812f1acb8b4639425180cdc9c00f6afc8e627de49bb240e9fbfe2"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.587510 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-6pf8x" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.588679 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-stnqj" 
event={"ID":"185ad843-7fab-4ae9-9e83-91c681a93f90","Type":"ContainerStarted","Data":"7d5eccde98081ec5c052e3bd662703fbf5777a5e4043a2f09e9397b252bd0fd1"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.589070 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-stnqj" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.590406 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-wn4dr" event={"ID":"d0a49d36-fd45-4ff2-9bb9-f1ccfb048537","Type":"ContainerStarted","Data":"8fc304a0079c12b30193004f3666b3716761d63ac6ce0175b2683bde4962cc44"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.590850 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-wn4dr" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.592199 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hg8kc" event={"ID":"5c650888-8d3a-4835-bdc0-2686f8881f62","Type":"ContainerStarted","Data":"3c186d02e59900c71aea59c2f36167c0d83a6903d85fe3a9b79e2dea38f3834f"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.592632 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hg8kc" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.593918 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2v4j6" event={"ID":"1048d1cb-c6cf-47dd-8896-8445f09e6d25","Type":"ContainerStarted","Data":"8d4305bdafe825ffb316c0ab4459adcad697e8f8c0b86f8a295517ed61cc2cb1"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.594225 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2v4j6" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.595361 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-x9hn5" event={"ID":"3c744c82-fc71-4e82-8d8c-4f43404a7664","Type":"ContainerStarted","Data":"7a84bbfe01f7242f36ae21dfaab4caa3e2b2dd78d224d60aafdc23ba0442742c"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.595714 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-x9hn5" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.596740 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-fbwjk" event={"ID":"c60b7b7f-31b3-49da-b7a3-9559345c180b","Type":"ContainerStarted","Data":"f56d4be4d72a156e40c6c780c399c08704fd922bd5611edfebf62b4466857134"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.597066 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-fbwjk" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.598396 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-thmph" event={"ID":"fdaac0ac-492e-438a-b737-55c88fcf77f1","Type":"ContainerStarted","Data":"f5618a0cf7fa09a53d7beab78780dfdfa3647dc078f5d4bb23a6fc60fc9aa626"} Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.598750 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-thmph" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.633516 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-lcvlv" podStartSLOduration=2.539392601 
podStartE2EDuration="13.633495837s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.425105763 +0000 UTC m=+1095.944578842" lastFinishedPulling="2026-03-18 18:20:50.519208999 +0000 UTC m=+1107.038682078" observedRunningTime="2026-03-18 18:20:51.604995162 +0000 UTC m=+1108.124468241" watchObservedRunningTime="2026-03-18 18:20:51.633495837 +0000 UTC m=+1108.152968916" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.660947 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2v4j6" podStartSLOduration=2.974387689 podStartE2EDuration="13.660923606s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.845313059 +0000 UTC m=+1096.364786148" lastFinishedPulling="2026-03-18 18:20:50.531848986 +0000 UTC m=+1107.051322065" observedRunningTime="2026-03-18 18:20:51.656124335 +0000 UTC m=+1108.175597414" watchObservedRunningTime="2026-03-18 18:20:51.660923606 +0000 UTC m=+1108.180396715" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.661362 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-thfzw" podStartSLOduration=3.394854074 podStartE2EDuration="14.661356977s" podCreationTimestamp="2026-03-18 18:20:37 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.252738907 +0000 UTC m=+1095.772211986" lastFinishedPulling="2026-03-18 18:20:50.51924181 +0000 UTC m=+1107.038714889" observedRunningTime="2026-03-18 18:20:51.636054521 +0000 UTC m=+1108.155527600" watchObservedRunningTime="2026-03-18 18:20:51.661356977 +0000 UTC m=+1108.180830056" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.719643 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-thmph" podStartSLOduration=2.698438164 
podStartE2EDuration="13.719629179s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.510066386 +0000 UTC m=+1096.029539465" lastFinishedPulling="2026-03-18 18:20:50.531257391 +0000 UTC m=+1107.050730480" observedRunningTime="2026-03-18 18:20:51.717904146 +0000 UTC m=+1108.237377225" watchObservedRunningTime="2026-03-18 18:20:51.719629179 +0000 UTC m=+1108.239102258" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.723060 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4ctxd" podStartSLOduration=2.474028301 podStartE2EDuration="13.723052935s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.27077823 +0000 UTC m=+1095.790251299" lastFinishedPulling="2026-03-18 18:20:50.519802864 +0000 UTC m=+1107.039275933" observedRunningTime="2026-03-18 18:20:51.685705378 +0000 UTC m=+1108.205178447" watchObservedRunningTime="2026-03-18 18:20:51.723052935 +0000 UTC m=+1108.242526014" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.740489 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-x9hn5" podStartSLOduration=3.478212116 podStartE2EDuration="14.740469552s" podCreationTimestamp="2026-03-18 18:20:37 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.25205366 +0000 UTC m=+1095.771526739" lastFinishedPulling="2026-03-18 18:20:50.514311096 +0000 UTC m=+1107.033784175" observedRunningTime="2026-03-18 18:20:51.739345474 +0000 UTC m=+1108.258818543" watchObservedRunningTime="2026-03-18 18:20:51.740469552 +0000 UTC m=+1108.259942631" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.765895 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-stnqj" podStartSLOduration=3.398063943 
podStartE2EDuration="13.76587655s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.166851441 +0000 UTC m=+1095.686324520" lastFinishedPulling="2026-03-18 18:20:49.534664048 +0000 UTC m=+1106.054137127" observedRunningTime="2026-03-18 18:20:51.76229117 +0000 UTC m=+1108.281764249" watchObservedRunningTime="2026-03-18 18:20:51.76587655 +0000 UTC m=+1108.285349629" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.788944 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-7fvgk" podStartSLOduration=2.64011326 podStartE2EDuration="13.788926329s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.44570375 +0000 UTC m=+1095.965176829" lastFinishedPulling="2026-03-18 18:20:50.594516829 +0000 UTC m=+1107.113989898" observedRunningTime="2026-03-18 18:20:51.786622411 +0000 UTC m=+1108.306095490" watchObservedRunningTime="2026-03-18 18:20:51.788926329 +0000 UTC m=+1108.308399408" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.847649 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9w9g2" podStartSLOduration=2.817859771 podStartE2EDuration="13.847630832s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.522387015 +0000 UTC m=+1096.041860094" lastFinishedPulling="2026-03-18 18:20:50.552158076 +0000 UTC m=+1107.071631155" observedRunningTime="2026-03-18 18:20:51.845995401 +0000 UTC m=+1108.365468480" watchObservedRunningTime="2026-03-18 18:20:51.847630832 +0000 UTC m=+1108.367103911" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.850308 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hg8kc" podStartSLOduration=3.374156913 
podStartE2EDuration="13.850303459s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.511721997 +0000 UTC m=+1096.031195076" lastFinishedPulling="2026-03-18 18:20:49.987868543 +0000 UTC m=+1106.507341622" observedRunningTime="2026-03-18 18:20:51.826529542 +0000 UTC m=+1108.346002621" watchObservedRunningTime="2026-03-18 18:20:51.850303459 +0000 UTC m=+1108.369776538" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.860368 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-6pf8x" podStartSLOduration=2.884780841 podStartE2EDuration="13.860353381s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.547395843 +0000 UTC m=+1096.066868922" lastFinishedPulling="2026-03-18 18:20:50.522968363 +0000 UTC m=+1107.042441462" observedRunningTime="2026-03-18 18:20:51.856900155 +0000 UTC m=+1108.376373224" watchObservedRunningTime="2026-03-18 18:20:51.860353381 +0000 UTC m=+1108.379826460" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.877724 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-x5ghn" podStartSLOduration=2.977650912 podStartE2EDuration="13.877710667s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.631080383 +0000 UTC m=+1096.150553462" lastFinishedPulling="2026-03-18 18:20:50.531140118 +0000 UTC m=+1107.050613217" observedRunningTime="2026-03-18 18:20:51.87024929 +0000 UTC m=+1108.389722369" watchObservedRunningTime="2026-03-18 18:20:51.877710667 +0000 UTC m=+1108.397183746" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.897810 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-wn4dr" podStartSLOduration=3.414402124 
podStartE2EDuration="14.897794551s" podCreationTimestamp="2026-03-18 18:20:37 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.03095791 +0000 UTC m=+1095.550430989" lastFinishedPulling="2026-03-18 18:20:50.514350327 +0000 UTC m=+1107.033823416" observedRunningTime="2026-03-18 18:20:51.892613541 +0000 UTC m=+1108.412086620" watchObservedRunningTime="2026-03-18 18:20:51.897794551 +0000 UTC m=+1108.417267630" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.912182 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-cgg7x" podStartSLOduration=4.1774430559999995 podStartE2EDuration="14.912167312s" podCreationTimestamp="2026-03-18 18:20:37 +0000 UTC" firstStartedPulling="2026-03-18 18:20:38.79986149 +0000 UTC m=+1095.319334569" lastFinishedPulling="2026-03-18 18:20:49.534585746 +0000 UTC m=+1106.054058825" observedRunningTime="2026-03-18 18:20:51.909997567 +0000 UTC m=+1108.429470656" watchObservedRunningTime="2026-03-18 18:20:51.912167312 +0000 UTC m=+1108.431640391" Mar 18 18:20:51 crc kubenswrapper[5008]: I0318 18:20:51.929093 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-fbwjk" podStartSLOduration=4.560641613 podStartE2EDuration="14.929073226s" podCreationTimestamp="2026-03-18 18:20:37 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.166213895 +0000 UTC m=+1095.685686974" lastFinishedPulling="2026-03-18 18:20:49.534645508 +0000 UTC m=+1106.054118587" observedRunningTime="2026-03-18 18:20:51.922581983 +0000 UTC m=+1108.442055062" watchObservedRunningTime="2026-03-18 18:20:51.929073226 +0000 UTC m=+1108.448546305" Mar 18 18:20:54 crc kubenswrapper[5008]: I0318 18:20:54.180411 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert\") pod 
\"infra-operator-controller-manager-7b9c774f96-k9grx\" (UID: \"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:20:54 crc kubenswrapper[5008]: E0318 18:20:54.180639 5008 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 18:20:54 crc kubenswrapper[5008]: E0318 18:20:54.181215 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert podName:b78718c8-6ad4-4bc1-ae6d-26bcfbddf493 nodeName:}" failed. No retries permitted until 2026-03-18 18:21:10.181186863 +0000 UTC m=+1126.700659952 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert") pod "infra-operator-controller-manager-7b9c774f96-k9grx" (UID: "b78718c8-6ad4-4bc1-ae6d-26bcfbddf493") : secret "infra-operator-webhook-server-cert" not found Mar 18 18:20:54 crc kubenswrapper[5008]: I0318 18:20:54.460198 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:20:54 crc kubenswrapper[5008]: I0318 18:20:54.460249 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:20:54 crc kubenswrapper[5008]: I0318 18:20:54.485267 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mb8mj\" (UID: \"d5e131ed-11af-4026-a4d2-7a25e42e38c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:20:54 crc kubenswrapper[5008]: I0318 18:20:54.496638 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5e131ed-11af-4026-a4d2-7a25e42e38c9-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mb8mj\" (UID: \"d5e131ed-11af-4026-a4d2-7a25e42e38c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:20:54 crc kubenswrapper[5008]: I0318 18:20:54.592543 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:20:54 crc kubenswrapper[5008]: I0318 18:20:54.635132 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt" event={"ID":"3b9302e6-1019-49f3-a708-d8552045764e","Type":"ContainerStarted","Data":"524665a4a6299c8cf11ad0d08255a9b27ec3d36deb2af00e5ef609063b128019"} Mar 18 18:20:54 crc kubenswrapper[5008]: I0318 18:20:54.635388 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt" Mar 18 18:20:54 crc kubenswrapper[5008]: I0318 18:20:54.657518 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt" podStartSLOduration=2.330250762 podStartE2EDuration="16.657498948s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.761970718 +0000 UTC m=+1096.281443797" lastFinishedPulling="2026-03-18 18:20:54.089218904 +0000 UTC m=+1110.608691983" 
observedRunningTime="2026-03-18 18:20:54.653249141 +0000 UTC m=+1111.172722240" watchObservedRunningTime="2026-03-18 18:20:54.657498948 +0000 UTC m=+1111.176972037" Mar 18 18:20:54 crc kubenswrapper[5008]: I0318 18:20:54.890629 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:54 crc kubenswrapper[5008]: I0318 18:20:54.890696 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:54 crc kubenswrapper[5008]: E0318 18:20:54.890855 5008 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 18:20:54 crc kubenswrapper[5008]: E0318 18:20:54.890939 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs podName:002df576-b6e1-4ffd-9eda-5751dcf89505 nodeName:}" failed. No retries permitted until 2026-03-18 18:21:10.890916897 +0000 UTC m=+1127.410389986 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-twfsv" (UID: "002df576-b6e1-4ffd-9eda-5751dcf89505") : secret "webhook-server-cert" not found Mar 18 18:20:54 crc kubenswrapper[5008]: I0318 18:20:54.896728 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:20:55 crc kubenswrapper[5008]: I0318 18:20:55.171487 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj"] Mar 18 18:20:55 crc kubenswrapper[5008]: I0318 18:20:55.644282 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" event={"ID":"d5e131ed-11af-4026-a4d2-7a25e42e38c9","Type":"ContainerStarted","Data":"49c3d06776b992bb673d5e3d0515acf25f947e596d7ef786e7d73e20c37f549d"} Mar 18 18:20:58 crc kubenswrapper[5008]: I0318 18:20:58.278785 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-cgg7x" Mar 18 18:20:58 crc kubenswrapper[5008]: I0318 18:20:58.364111 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-fbwjk" Mar 18 18:20:58 crc kubenswrapper[5008]: I0318 18:20:58.364588 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-wn4dr" Mar 18 18:20:58 crc kubenswrapper[5008]: I0318 
18:20:58.373131 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-thfzw" Mar 18 18:20:58 crc kubenswrapper[5008]: I0318 18:20:58.394987 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-x9hn5" Mar 18 18:20:58 crc kubenswrapper[5008]: I0318 18:20:58.451532 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-stnqj" Mar 18 18:20:58 crc kubenswrapper[5008]: I0318 18:20:58.482775 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4ctxd" Mar 18 18:20:58 crc kubenswrapper[5008]: I0318 18:20:58.505483 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-7fvgk" Mar 18 18:20:58 crc kubenswrapper[5008]: I0318 18:20:58.680954 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-lcvlv" Mar 18 18:20:58 crc kubenswrapper[5008]: I0318 18:20:58.794736 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-thmph" Mar 18 18:20:58 crc kubenswrapper[5008]: I0318 18:20:58.873328 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-6pf8x" Mar 18 18:20:58 crc kubenswrapper[5008]: I0318 18:20:58.905254 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9w9g2" Mar 18 18:20:59 crc kubenswrapper[5008]: I0318 18:20:59.022546 5008 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-x5ghn" Mar 18 18:20:59 crc kubenswrapper[5008]: I0318 18:20:59.077856 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hg8kc" Mar 18 18:20:59 crc kubenswrapper[5008]: I0318 18:20:59.099512 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-rjdmt" Mar 18 18:20:59 crc kubenswrapper[5008]: I0318 18:20:59.172930 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-2v4j6" Mar 18 18:21:00 crc kubenswrapper[5008]: I0318 18:21:00.683720 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" event={"ID":"d5e131ed-11af-4026-a4d2-7a25e42e38c9","Type":"ContainerStarted","Data":"7efdeed3b379565fef826e1c2ed1b1ae563bc8ca046ac126e8e8f30f10ef4997"} Mar 18 18:21:00 crc kubenswrapper[5008]: I0318 18:21:00.685382 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:21:00 crc kubenswrapper[5008]: I0318 18:21:00.685790 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8" event={"ID":"8d3f1117-f02f-406b-bae8-84e5e71212c1","Type":"ContainerStarted","Data":"4403d10975167c57ef83715fe3a7edf39738dda482b5a694a5b546367ff4d476"} Mar 18 18:21:00 crc kubenswrapper[5008]: I0318 18:21:00.688035 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd" 
event={"ID":"280f2ba7-134f-472b-82f4-d3728bbe6d31","Type":"ContainerStarted","Data":"75ed40413dc2292a93870e3f16c65ac3386a6485c0f204dd4c4f80fecac143d8"} Mar 18 18:21:00 crc kubenswrapper[5008]: I0318 18:21:00.689374 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn" event={"ID":"ccf8ba7b-3e8e-4b28-9705-26a812edfc07","Type":"ContainerStarted","Data":"cf0874077b9b43fcd82dd6bd2854cc0fde6e87ec2e4bcc51ff03cd275001ce5a"} Mar 18 18:21:00 crc kubenswrapper[5008]: I0318 18:21:00.689488 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn" Mar 18 18:21:00 crc kubenswrapper[5008]: I0318 18:21:00.690839 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6" event={"ID":"da6e231d-9d56-4c1c-a10e-b4e258a88c2a","Type":"ContainerStarted","Data":"e8bf25c292a2a6438f15169df389d56e144987a436605dee794631f9b6614d1d"} Mar 18 18:21:00 crc kubenswrapper[5008]: I0318 18:21:00.691073 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6" Mar 18 18:21:00 crc kubenswrapper[5008]: I0318 18:21:00.709610 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" podStartSLOduration=18.451162868 podStartE2EDuration="22.709594473s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:55.178062484 +0000 UTC m=+1111.697535563" lastFinishedPulling="2026-03-18 18:20:59.436494099 +0000 UTC m=+1115.955967168" observedRunningTime="2026-03-18 18:21:00.707664255 +0000 UTC m=+1117.227137334" watchObservedRunningTime="2026-03-18 18:21:00.709594473 +0000 UTC m=+1117.229067552" Mar 18 18:21:00 crc kubenswrapper[5008]: I0318 
18:21:00.729736 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6" podStartSLOduration=3.157363172 podStartE2EDuration="22.729711548s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.851825353 +0000 UTC m=+1096.371298442" lastFinishedPulling="2026-03-18 18:20:59.424173739 +0000 UTC m=+1115.943646818" observedRunningTime="2026-03-18 18:21:00.723703887 +0000 UTC m=+1117.243176966" watchObservedRunningTime="2026-03-18 18:21:00.729711548 +0000 UTC m=+1117.249184627" Mar 18 18:21:00 crc kubenswrapper[5008]: I0318 18:21:00.745599 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn" podStartSLOduration=2.825153234 podStartE2EDuration="22.745577676s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.547769392 +0000 UTC m=+1096.067242471" lastFinishedPulling="2026-03-18 18:20:59.468193834 +0000 UTC m=+1115.987666913" observedRunningTime="2026-03-18 18:21:00.73772787 +0000 UTC m=+1117.257200959" watchObservedRunningTime="2026-03-18 18:21:00.745577676 +0000 UTC m=+1117.265050765" Mar 18 18:21:00 crc kubenswrapper[5008]: I0318 18:21:00.764042 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nlftd" podStartSLOduration=3.024808135 podStartE2EDuration="22.76402229s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.694765391 +0000 UTC m=+1096.214238470" lastFinishedPulling="2026-03-18 18:20:59.433979526 +0000 UTC m=+1115.953452625" observedRunningTime="2026-03-18 18:21:00.763281101 +0000 UTC m=+1117.282754190" watchObservedRunningTime="2026-03-18 18:21:00.76402229 +0000 UTC m=+1117.283495379" Mar 18 18:21:04 crc kubenswrapper[5008]: I0318 18:21:04.599866 5008 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mb8mj" Mar 18 18:21:04 crc kubenswrapper[5008]: I0318 18:21:04.640878 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8" podStartSLOduration=7.011682874 podStartE2EDuration="26.640846896s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:20:39.743653728 +0000 UTC m=+1096.263126807" lastFinishedPulling="2026-03-18 18:20:59.37281776 +0000 UTC m=+1115.892290829" observedRunningTime="2026-03-18 18:21:00.782199436 +0000 UTC m=+1117.301672515" watchObservedRunningTime="2026-03-18 18:21:04.640846896 +0000 UTC m=+1121.160319995" Mar 18 18:21:08 crc kubenswrapper[5008]: I0318 18:21:08.906994 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-k8vzn" Mar 18 18:21:09 crc kubenswrapper[5008]: I0318 18:21:09.133475 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8" Mar 18 18:21:09 crc kubenswrapper[5008]: I0318 18:21:09.135629 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-kqmf8" Mar 18 18:21:09 crc kubenswrapper[5008]: I0318 18:21:09.144416 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-g6vj6" Mar 18 18:21:10 crc kubenswrapper[5008]: I0318 18:21:10.189315 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert\") pod \"infra-operator-controller-manager-7b9c774f96-k9grx\" (UID: 
\"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:21:10 crc kubenswrapper[5008]: I0318 18:21:10.197548 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b78718c8-6ad4-4bc1-ae6d-26bcfbddf493-cert\") pod \"infra-operator-controller-manager-7b9c774f96-k9grx\" (UID: \"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:21:10 crc kubenswrapper[5008]: I0318 18:21:10.262727 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:21:10 crc kubenswrapper[5008]: I0318 18:21:10.697486 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx"] Mar 18 18:21:10 crc kubenswrapper[5008]: W0318 18:21:10.703470 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb78718c8_6ad4_4bc1_ae6d_26bcfbddf493.slice/crio-acb06ca3f77680d848a9fdcd8191a89d4a53e28d4e9624c077086642db5cae9b WatchSource:0}: Error finding container acb06ca3f77680d848a9fdcd8191a89d4a53e28d4e9624c077086642db5cae9b: Status 404 returned error can't find the container with id acb06ca3f77680d848a9fdcd8191a89d4a53e28d4e9624c077086642db5cae9b Mar 18 18:21:10 crc kubenswrapper[5008]: I0318 18:21:10.769401 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" event={"ID":"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493","Type":"ContainerStarted","Data":"acb06ca3f77680d848a9fdcd8191a89d4a53e28d4e9624c077086642db5cae9b"} Mar 18 18:21:10 crc kubenswrapper[5008]: I0318 18:21:10.899705 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" 
(UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:21:10 crc kubenswrapper[5008]: I0318 18:21:10.904258 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/002df576-b6e1-4ffd-9eda-5751dcf89505-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-twfsv\" (UID: \"002df576-b6e1-4ffd-9eda-5751dcf89505\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:21:10 crc kubenswrapper[5008]: I0318 18:21:10.996490 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:21:11 crc kubenswrapper[5008]: I0318 18:21:11.268732 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv"] Mar 18 18:21:11 crc kubenswrapper[5008]: I0318 18:21:11.776832 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" event={"ID":"002df576-b6e1-4ffd-9eda-5751dcf89505","Type":"ContainerStarted","Data":"57bdfbb36ebb3e26f5a1d2c9557e82c697671b2a1b070adee2083d42443dfe20"} Mar 18 18:21:13 crc kubenswrapper[5008]: I0318 18:21:13.792447 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" event={"ID":"002df576-b6e1-4ffd-9eda-5751dcf89505","Type":"ContainerStarted","Data":"1319b8a2e6095fa2417252eda625e45bdfc16dba9fde999bedcef7f7ed062b90"} Mar 18 18:21:14 crc kubenswrapper[5008]: I0318 18:21:14.803213 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:21:14 crc kubenswrapper[5008]: I0318 18:21:14.858444 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" podStartSLOduration=36.858408871 podStartE2EDuration="36.858408871s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:21:14.854188705 +0000 UTC m=+1131.373661804" watchObservedRunningTime="2026-03-18 18:21:14.858408871 +0000 UTC m=+1131.377881990" Mar 18 18:21:16 crc kubenswrapper[5008]: I0318 18:21:16.819830 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" event={"ID":"b78718c8-6ad4-4bc1-ae6d-26bcfbddf493","Type":"ContainerStarted","Data":"35f4211184fb595cdad9b5af8f3b07a87e414c142523c4cf10a4785711abf420"} Mar 18 18:21:16 crc kubenswrapper[5008]: I0318 18:21:16.820163 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:21:16 crc kubenswrapper[5008]: I0318 18:21:16.853629 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" podStartSLOduration=33.505152677 podStartE2EDuration="38.853593448s" podCreationTimestamp="2026-03-18 18:20:38 +0000 UTC" firstStartedPulling="2026-03-18 18:21:10.70844538 +0000 UTC m=+1127.227918449" lastFinishedPulling="2026-03-18 18:21:16.056886141 +0000 UTC m=+1132.576359220" observedRunningTime="2026-03-18 18:21:16.845606248 +0000 UTC m=+1133.365079397" watchObservedRunningTime="2026-03-18 18:21:16.853593448 +0000 UTC m=+1133.373066577" Mar 18 18:21:21 crc kubenswrapper[5008]: I0318 18:21:21.008856 5008 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-twfsv" Mar 18 18:21:24 crc kubenswrapper[5008]: I0318 18:21:24.460760 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:21:24 crc kubenswrapper[5008]: I0318 18:21:24.461739 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:21:30 crc kubenswrapper[5008]: I0318 18:21:30.270340 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-k9grx" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.314769 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-4n6fm"] Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.328820 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-4n6fm" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.336081 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.336667 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.337238 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.338303 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jhv6w" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.341583 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-4n6fm"] Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.418108 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64696987c5-lvh9w"] Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.424096 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-lvh9w" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.426781 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.429172 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v7tt\" (UniqueName: \"kubernetes.io/projected/17dd6da7-ebe7-4b74-a4df-19c6dec82210-kube-api-access-6v7tt\") pod \"dnsmasq-dns-5448ff6dc7-4n6fm\" (UID: \"17dd6da7-ebe7-4b74-a4df-19c6dec82210\") " pod="openstack/dnsmasq-dns-5448ff6dc7-4n6fm" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.429306 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dd6da7-ebe7-4b74-a4df-19c6dec82210-config\") pod \"dnsmasq-dns-5448ff6dc7-4n6fm\" (UID: \"17dd6da7-ebe7-4b74-a4df-19c6dec82210\") " pod="openstack/dnsmasq-dns-5448ff6dc7-4n6fm" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.434004 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-lvh9w"] Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.530857 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v7tt\" (UniqueName: \"kubernetes.io/projected/17dd6da7-ebe7-4b74-a4df-19c6dec82210-kube-api-access-6v7tt\") pod \"dnsmasq-dns-5448ff6dc7-4n6fm\" (UID: \"17dd6da7-ebe7-4b74-a4df-19c6dec82210\") " pod="openstack/dnsmasq-dns-5448ff6dc7-4n6fm" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.530933 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dd6da7-ebe7-4b74-a4df-19c6dec82210-config\") pod \"dnsmasq-dns-5448ff6dc7-4n6fm\" (UID: \"17dd6da7-ebe7-4b74-a4df-19c6dec82210\") " pod="openstack/dnsmasq-dns-5448ff6dc7-4n6fm" Mar 18 18:21:46 crc 
kubenswrapper[5008]: I0318 18:21:46.530986 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-config\") pod \"dnsmasq-dns-64696987c5-lvh9w\" (UID: \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\") " pod="openstack/dnsmasq-dns-64696987c5-lvh9w" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.531053 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rctxd\" (UniqueName: \"kubernetes.io/projected/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-kube-api-access-rctxd\") pod \"dnsmasq-dns-64696987c5-lvh9w\" (UID: \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\") " pod="openstack/dnsmasq-dns-64696987c5-lvh9w" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.531073 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-dns-svc\") pod \"dnsmasq-dns-64696987c5-lvh9w\" (UID: \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\") " pod="openstack/dnsmasq-dns-64696987c5-lvh9w" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.532199 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dd6da7-ebe7-4b74-a4df-19c6dec82210-config\") pod \"dnsmasq-dns-5448ff6dc7-4n6fm\" (UID: \"17dd6da7-ebe7-4b74-a4df-19c6dec82210\") " pod="openstack/dnsmasq-dns-5448ff6dc7-4n6fm" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.549319 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v7tt\" (UniqueName: \"kubernetes.io/projected/17dd6da7-ebe7-4b74-a4df-19c6dec82210-kube-api-access-6v7tt\") pod \"dnsmasq-dns-5448ff6dc7-4n6fm\" (UID: \"17dd6da7-ebe7-4b74-a4df-19c6dec82210\") " pod="openstack/dnsmasq-dns-5448ff6dc7-4n6fm" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 
18:21:46.632279 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-config\") pod \"dnsmasq-dns-64696987c5-lvh9w\" (UID: \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\") " pod="openstack/dnsmasq-dns-64696987c5-lvh9w" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.632720 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rctxd\" (UniqueName: \"kubernetes.io/projected/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-kube-api-access-rctxd\") pod \"dnsmasq-dns-64696987c5-lvh9w\" (UID: \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\") " pod="openstack/dnsmasq-dns-64696987c5-lvh9w" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.632923 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-dns-svc\") pod \"dnsmasq-dns-64696987c5-lvh9w\" (UID: \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\") " pod="openstack/dnsmasq-dns-64696987c5-lvh9w" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.633103 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-config\") pod \"dnsmasq-dns-64696987c5-lvh9w\" (UID: \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\") " pod="openstack/dnsmasq-dns-64696987c5-lvh9w" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.633821 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-dns-svc\") pod \"dnsmasq-dns-64696987c5-lvh9w\" (UID: \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\") " pod="openstack/dnsmasq-dns-64696987c5-lvh9w" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.659885 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rctxd\" 
(UniqueName: \"kubernetes.io/projected/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-kube-api-access-rctxd\") pod \"dnsmasq-dns-64696987c5-lvh9w\" (UID: \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\") " pod="openstack/dnsmasq-dns-64696987c5-lvh9w" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.667696 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-4n6fm" Mar 18 18:21:46 crc kubenswrapper[5008]: I0318 18:21:46.736418 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-lvh9w" Mar 18 18:21:47 crc kubenswrapper[5008]: I0318 18:21:47.012722 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-lvh9w"] Mar 18 18:21:47 crc kubenswrapper[5008]: I0318 18:21:47.088713 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-lvh9w" event={"ID":"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5","Type":"ContainerStarted","Data":"811e2ff68fa478258e1c4e50442b1168d5fc901f6e84760457134a7a0282c5e1"} Mar 18 18:21:47 crc kubenswrapper[5008]: I0318 18:21:47.098803 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-4n6fm"] Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.117843 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-4n6fm" event={"ID":"17dd6da7-ebe7-4b74-a4df-19c6dec82210","Type":"ContainerStarted","Data":"1011ea9428c39e97dcc752018e40935584f58e6af7606effd3cc5ff4cabe4502"} Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.210250 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-4n6fm"] Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.234728 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-qqz54"] Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.236051 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.254889 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-qqz54"] Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.360820 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc9d239d-b8eb-4e7a-a630-52617586149e-dns-svc\") pod \"dnsmasq-dns-854f47b4f9-qqz54\" (UID: \"dc9d239d-b8eb-4e7a-a630-52617586149e\") " pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.360947 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc9d239d-b8eb-4e7a-a630-52617586149e-config\") pod \"dnsmasq-dns-854f47b4f9-qqz54\" (UID: \"dc9d239d-b8eb-4e7a-a630-52617586149e\") " pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.360992 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tjvq\" (UniqueName: \"kubernetes.io/projected/dc9d239d-b8eb-4e7a-a630-52617586149e-kube-api-access-2tjvq\") pod \"dnsmasq-dns-854f47b4f9-qqz54\" (UID: \"dc9d239d-b8eb-4e7a-a630-52617586149e\") " pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.462117 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tjvq\" (UniqueName: \"kubernetes.io/projected/dc9d239d-b8eb-4e7a-a630-52617586149e-kube-api-access-2tjvq\") pod \"dnsmasq-dns-854f47b4f9-qqz54\" (UID: \"dc9d239d-b8eb-4e7a-a630-52617586149e\") " pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.462195 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc9d239d-b8eb-4e7a-a630-52617586149e-dns-svc\") pod \"dnsmasq-dns-854f47b4f9-qqz54\" (UID: \"dc9d239d-b8eb-4e7a-a630-52617586149e\") " pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.462272 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc9d239d-b8eb-4e7a-a630-52617586149e-config\") pod \"dnsmasq-dns-854f47b4f9-qqz54\" (UID: \"dc9d239d-b8eb-4e7a-a630-52617586149e\") " pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.463207 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc9d239d-b8eb-4e7a-a630-52617586149e-dns-svc\") pod \"dnsmasq-dns-854f47b4f9-qqz54\" (UID: \"dc9d239d-b8eb-4e7a-a630-52617586149e\") " pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.463283 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc9d239d-b8eb-4e7a-a630-52617586149e-config\") pod \"dnsmasq-dns-854f47b4f9-qqz54\" (UID: \"dc9d239d-b8eb-4e7a-a630-52617586149e\") " pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.494782 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tjvq\" (UniqueName: \"kubernetes.io/projected/dc9d239d-b8eb-4e7a-a630-52617586149e-kube-api-access-2tjvq\") pod \"dnsmasq-dns-854f47b4f9-qqz54\" (UID: \"dc9d239d-b8eb-4e7a-a630-52617586149e\") " pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:21:48 crc kubenswrapper[5008]: I0318 18:21:48.555999 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.083578 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-qqz54"] Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.126385 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" event={"ID":"dc9d239d-b8eb-4e7a-a630-52617586149e","Type":"ContainerStarted","Data":"0256b436e152155e6c17ada2b8cc51c705b70346777fe114ad5cb1cc704384d4"} Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.281668 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-lvh9w"] Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.307731 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-4kgnr"] Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.308919 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.314848 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-4kgnr"] Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.370586 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.371632 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.374641 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.374730 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.374979 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.375126 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gplck" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.375440 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.375605 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.375748 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.396216 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485065 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485106 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485133 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485161 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485178 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485198 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af39f6a2-816f-4c55-952b-77a69da86828-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-4kgnr\" (UID: \"af39f6a2-816f-4c55-952b-77a69da86828\") " pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485213 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485236 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af39f6a2-816f-4c55-952b-77a69da86828-config\") pod \"dnsmasq-dns-54b5dffb47-4kgnr\" (UID: \"af39f6a2-816f-4c55-952b-77a69da86828\") " pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485270 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485289 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485303 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z4nk\" (UniqueName: \"kubernetes.io/projected/af39f6a2-816f-4c55-952b-77a69da86828-kube-api-access-8z4nk\") pod \"dnsmasq-dns-54b5dffb47-4kgnr\" (UID: \"af39f6a2-816f-4c55-952b-77a69da86828\") " pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485318 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485332 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.485347 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghgp5\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-kube-api-access-ghgp5\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.587020 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af39f6a2-816f-4c55-952b-77a69da86828-config\") pod \"dnsmasq-dns-54b5dffb47-4kgnr\" (UID: \"af39f6a2-816f-4c55-952b-77a69da86828\") " pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.587117 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.587152 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: 
\"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.587177 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z4nk\" (UniqueName: \"kubernetes.io/projected/af39f6a2-816f-4c55-952b-77a69da86828-kube-api-access-8z4nk\") pod \"dnsmasq-dns-54b5dffb47-4kgnr\" (UID: \"af39f6a2-816f-4c55-952b-77a69da86828\") " pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.587200 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.587221 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.587242 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghgp5\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-kube-api-access-ghgp5\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.587285 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc 
kubenswrapper[5008]: I0318 18:21:49.587311 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.587339 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.587373 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.587394 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.587421 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af39f6a2-816f-4c55-952b-77a69da86828-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-4kgnr\" (UID: \"af39f6a2-816f-4c55-952b-77a69da86828\") " pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.587445 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.588209 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af39f6a2-816f-4c55-952b-77a69da86828-config\") pod \"dnsmasq-dns-54b5dffb47-4kgnr\" (UID: \"af39f6a2-816f-4c55-952b-77a69da86828\") " pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.588622 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.589129 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.589214 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af39f6a2-816f-4c55-952b-77a69da86828-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-4kgnr\" (UID: \"af39f6a2-816f-4c55-952b-77a69da86828\") " pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.589610 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " 
pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.589848 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.590122 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.590832 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.597118 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.597352 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.598423 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.601644 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.604421 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z4nk\" (UniqueName: \"kubernetes.io/projected/af39f6a2-816f-4c55-952b-77a69da86828-kube-api-access-8z4nk\") pod \"dnsmasq-dns-54b5dffb47-4kgnr\" (UID: \"af39f6a2-816f-4c55-952b-77a69da86828\") " pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.606292 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghgp5\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-kube-api-access-ghgp5\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.613811 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.634155 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.701021 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 18:21:49 crc kubenswrapper[5008]: I0318 18:21:49.897807 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-4kgnr"] Mar 18 18:21:49 crc kubenswrapper[5008]: W0318 18:21:49.912707 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf39f6a2_816f_4c55_952b_77a69da86828.slice/crio-263ac0d9acbc0baf7c7fc5dd1f964337609f6fe152cc9335a4d7dad7736aee5d WatchSource:0}: Error finding container 263ac0d9acbc0baf7c7fc5dd1f964337609f6fe152cc9335a4d7dad7736aee5d: Status 404 returned error can't find the container with id 263ac0d9acbc0baf7c7fc5dd1f964337609f6fe152cc9335a4d7dad7736aee5d Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.137618 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" event={"ID":"af39f6a2-816f-4c55-952b-77a69da86828","Type":"ContainerStarted","Data":"263ac0d9acbc0baf7c7fc5dd1f964337609f6fe152cc9335a4d7dad7736aee5d"} Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.241923 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 18:21:50 crc kubenswrapper[5008]: W0318 18:21:50.246047 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d5f0191_2702_46ed_ab82_e8c93ec1cf02.slice/crio-b69518ed40ab043502b3456686e18f20d80a4f35f6635ec351a6747e9c15a891 WatchSource:0}: Error finding container b69518ed40ab043502b3456686e18f20d80a4f35f6635ec351a6747e9c15a891: Status 404 returned error can't find the container with id b69518ed40ab043502b3456686e18f20d80a4f35f6635ec351a6747e9c15a891 Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.411546 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.414356 5008 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.420242 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.420483 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.421395 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xlqz9" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.421468 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.421611 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.421683 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.421756 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.421828 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.509523 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.509808 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.509832 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.509849 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.509879 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.509914 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.510022 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b60d757b-db66-46c1-ad92-4a9e591217a0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.510131 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.510174 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.510237 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4wzt\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-kube-api-access-k4wzt\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.510316 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b60d757b-db66-46c1-ad92-4a9e591217a0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.611547 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.611635 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.611666 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b60d757b-db66-46c1-ad92-4a9e591217a0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.611680 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.611696 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.611710 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k4wzt\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-kube-api-access-k4wzt\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.611741 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b60d757b-db66-46c1-ad92-4a9e591217a0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.611786 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.611804 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.611827 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.611846 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.612293 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.613004 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.614398 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.614444 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.615178 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc 
kubenswrapper[5008]: I0318 18:21:50.615513 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.624271 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b60d757b-db66-46c1-ad92-4a9e591217a0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.624311 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.624752 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b60d757b-db66-46c1-ad92-4a9e591217a0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.624866 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.633619 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k4wzt\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-kube-api-access-k4wzt\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.640471 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:50 crc kubenswrapper[5008]: I0318 18:21:50.758607 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.171601 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3d5f0191-2702-46ed-ab82-e8c93ec1cf02","Type":"ContainerStarted","Data":"b69518ed40ab043502b3456686e18f20d80a4f35f6635ec351a6747e9c15a891"} Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.265471 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.488511 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.490280 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.494884 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-rjgg5" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.495641 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.496098 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.496674 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.502140 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.503844 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.671093 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.671221 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-kolla-config\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.671242 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-config-data-default\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.671297 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.671334 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.671390 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.671412 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsbt6\" (UniqueName: \"kubernetes.io/projected/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-kube-api-access-hsbt6\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.671530 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.772710 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.773062 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-kolla-config\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.773129 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-config-data-default\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.773155 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.773322 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") 
" pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.773343 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.773364 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsbt6\" (UniqueName: \"kubernetes.io/projected/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-kube-api-access-hsbt6\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.773381 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.773719 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.773895 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.774260 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-kolla-config\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.774307 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-config-data-default\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.775293 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.778451 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.783521 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.790141 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsbt6\" (UniqueName: 
\"kubernetes.io/projected/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-kube-api-access-hsbt6\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.801179 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " pod="openstack/openstack-galera-0" Mar 18 18:21:51 crc kubenswrapper[5008]: I0318 18:21:51.816851 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 18:21:52 crc kubenswrapper[5008]: I0318 18:21:52.920701 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 18:21:52 crc kubenswrapper[5008]: I0318 18:21:52.921748 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:52 crc kubenswrapper[5008]: I0318 18:21:52.923686 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 18:21:52 crc kubenswrapper[5008]: I0318 18:21:52.924398 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 18:21:52 crc kubenswrapper[5008]: I0318 18:21:52.925597 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 18:21:52 crc kubenswrapper[5008]: I0318 18:21:52.925885 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dkzfs" Mar 18 18:21:52 crc kubenswrapper[5008]: I0318 18:21:52.933955 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.101833 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07bd6644-ca18-4b8d-ad83-9757257768fb-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.101904 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/07bd6644-ca18-4b8d-ad83-9757257768fb-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.101932 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.101959 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.102005 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 
18:21:53.102055 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/07bd6644-ca18-4b8d-ad83-9757257768fb-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.102088 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb45v\" (UniqueName: \"kubernetes.io/projected/07bd6644-ca18-4b8d-ad83-9757257768fb-kube-api-access-zb45v\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.102125 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.203285 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07bd6644-ca18-4b8d-ad83-9757257768fb-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.203340 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/07bd6644-ca18-4b8d-ad83-9757257768fb-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 
18:21:53.203358 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.203377 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.203412 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.203447 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/07bd6644-ca18-4b8d-ad83-9757257768fb-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.203470 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb45v\" (UniqueName: \"kubernetes.io/projected/07bd6644-ca18-4b8d-ad83-9757257768fb-kube-api-access-zb45v\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.203519 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.204263 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.204738 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/07bd6644-ca18-4b8d-ad83-9757257768fb-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.205177 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.206382 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.206453 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.210079 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07bd6644-ca18-4b8d-ad83-9757257768fb-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.210357 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/07bd6644-ca18-4b8d-ad83-9757257768fb-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.218226 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb45v\" (UniqueName: \"kubernetes.io/projected/07bd6644-ca18-4b8d-ad83-9757257768fb-kube-api-access-zb45v\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.257630 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.312525 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.313629 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.315493 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-xlcv7" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.315851 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.322474 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.326468 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.405542 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd78c73-6590-4035-af7d-357b8451f0ad-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.405938 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjhtg\" (UniqueName: \"kubernetes.io/projected/6cd78c73-6590-4035-af7d-357b8451f0ad-kube-api-access-kjhtg\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.405962 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd78c73-6590-4035-af7d-357b8451f0ad-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.406021 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cd78c73-6590-4035-af7d-357b8451f0ad-config-data\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.406067 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6cd78c73-6590-4035-af7d-357b8451f0ad-kolla-config\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.507365 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd78c73-6590-4035-af7d-357b8451f0ad-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.507404 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhtg\" (UniqueName: \"kubernetes.io/projected/6cd78c73-6590-4035-af7d-357b8451f0ad-kube-api-access-kjhtg\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.507421 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd78c73-6590-4035-af7d-357b8451f0ad-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.507472 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cd78c73-6590-4035-af7d-357b8451f0ad-config-data\") pod 
\"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.507520 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6cd78c73-6590-4035-af7d-357b8451f0ad-kolla-config\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.508325 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6cd78c73-6590-4035-af7d-357b8451f0ad-kolla-config\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.508433 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cd78c73-6590-4035-af7d-357b8451f0ad-config-data\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.514275 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd78c73-6590-4035-af7d-357b8451f0ad-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.515545 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd78c73-6590-4035-af7d-357b8451f0ad-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.534991 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhtg\" 
(UniqueName: \"kubernetes.io/projected/6cd78c73-6590-4035-af7d-357b8451f0ad-kube-api-access-kjhtg\") pod \"memcached-0\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " pod="openstack/memcached-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.554067 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 18:21:53 crc kubenswrapper[5008]: I0318 18:21:53.646406 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 18:21:54 crc kubenswrapper[5008]: I0318 18:21:54.460178 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:21:54 crc kubenswrapper[5008]: I0318 18:21:54.460243 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:21:54 crc kubenswrapper[5008]: I0318 18:21:54.460293 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:21:54 crc kubenswrapper[5008]: I0318 18:21:54.460940 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd022c5c3ebfc1487f31b5991ba1ecc58d2f77c9bf3db917b976667648d3cca3"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:21:54 crc kubenswrapper[5008]: I0318 18:21:54.461002 
5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://fd022c5c3ebfc1487f31b5991ba1ecc58d2f77c9bf3db917b976667648d3cca3" gracePeriod=600 Mar 18 18:21:55 crc kubenswrapper[5008]: I0318 18:21:55.215297 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b60d757b-db66-46c1-ad92-4a9e591217a0","Type":"ContainerStarted","Data":"44096718cbe5c33667ee937bbb0cce30deb092bb95ba0d99985feb6af80d0f06"} Mar 18 18:21:55 crc kubenswrapper[5008]: I0318 18:21:55.218811 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="fd022c5c3ebfc1487f31b5991ba1ecc58d2f77c9bf3db917b976667648d3cca3" exitCode=0 Mar 18 18:21:55 crc kubenswrapper[5008]: I0318 18:21:55.218857 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"fd022c5c3ebfc1487f31b5991ba1ecc58d2f77c9bf3db917b976667648d3cca3"} Mar 18 18:21:55 crc kubenswrapper[5008]: I0318 18:21:55.218891 5008 scope.go:117] "RemoveContainer" containerID="f98223e188c7e180bb9c16b9b888a18eaae99967d91bf2ff048b12e80fd84a1c" Mar 18 18:21:55 crc kubenswrapper[5008]: I0318 18:21:55.308607 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:21:55 crc kubenswrapper[5008]: I0318 18:21:55.309497 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:21:55 crc kubenswrapper[5008]: I0318 18:21:55.311182 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-wcclg" Mar 18 18:21:55 crc kubenswrapper[5008]: I0318 18:21:55.358062 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:21:55 crc kubenswrapper[5008]: I0318 18:21:55.443134 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5qms\" (UniqueName: \"kubernetes.io/projected/96ede22a-8990-49f1-8bb9-f0c08bb3c8b3-kube-api-access-w5qms\") pod \"kube-state-metrics-0\" (UID: \"96ede22a-8990-49f1-8bb9-f0c08bb3c8b3\") " pod="openstack/kube-state-metrics-0" Mar 18 18:21:55 crc kubenswrapper[5008]: I0318 18:21:55.544669 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5qms\" (UniqueName: \"kubernetes.io/projected/96ede22a-8990-49f1-8bb9-f0c08bb3c8b3-kube-api-access-w5qms\") pod \"kube-state-metrics-0\" (UID: \"96ede22a-8990-49f1-8bb9-f0c08bb3c8b3\") " pod="openstack/kube-state-metrics-0" Mar 18 18:21:55 crc kubenswrapper[5008]: I0318 18:21:55.566032 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5qms\" (UniqueName: \"kubernetes.io/projected/96ede22a-8990-49f1-8bb9-f0c08bb3c8b3-kube-api-access-w5qms\") pod \"kube-state-metrics-0\" (UID: \"96ede22a-8990-49f1-8bb9-f0c08bb3c8b3\") " pod="openstack/kube-state-metrics-0" Mar 18 18:21:55 crc kubenswrapper[5008]: I0318 18:21:55.628529 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:21:59 crc kubenswrapper[5008]: I0318 18:21:59.911399 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9qcqj"] Mar 18 18:21:59 crc kubenswrapper[5008]: I0318 18:21:59.914714 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9qcqj" Mar 18 18:21:59 crc kubenswrapper[5008]: I0318 18:21:59.918268 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 18 18:21:59 crc kubenswrapper[5008]: I0318 18:21:59.918532 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-w8ftn" Mar 18 18:21:59 crc kubenswrapper[5008]: I0318 18:21:59.918724 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 18 18:21:59 crc kubenswrapper[5008]: I0318 18:21:59.932197 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9qcqj"] Mar 18 18:21:59 crc kubenswrapper[5008]: I0318 18:21:59.946173 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-x8pkm"] Mar 18 18:21:59 crc kubenswrapper[5008]: I0318 18:21:59.948073 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:21:59 crc kubenswrapper[5008]: I0318 18:21:59.966370 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-x8pkm"] Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.016032 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-scripts\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.016092 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-run\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.016144 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-run-ovn\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.016169 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-combined-ca-bundle\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.016204 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-ovn-controller-tls-certs\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.016223 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-log-ovn\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.016244 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7gtp\" (UniqueName: \"kubernetes.io/projected/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-kube-api-access-q7gtp\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.117678 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-combined-ca-bundle\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.117737 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-478tc\" (UniqueName: \"kubernetes.io/projected/f55031bd-9626-475f-a74f-d0e5f8ec8a66-kube-api-access-478tc\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.117780 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-lib\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.117806 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-ovn-controller-tls-certs\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.117833 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-log-ovn\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.117860 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7gtp\" (UniqueName: \"kubernetes.io/projected/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-kube-api-access-q7gtp\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.117895 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f55031bd-9626-475f-a74f-d0e5f8ec8a66-scripts\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.117930 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-log\") pod 
\"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.117954 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-scripts\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.117988 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-run\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.118012 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-etc-ovs\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.118040 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-run\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.118084 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-run-ovn\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc 
kubenswrapper[5008]: I0318 18:22:00.118545 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-run-ovn\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.119115 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-log-ovn\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.119515 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-run\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.122204 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-scripts\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.143224 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-ovn-controller-tls-certs\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.146535 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564302-ht8rz"] Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 
18:22:00.147596 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564302-ht8rz" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.149114 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7gtp\" (UniqueName: \"kubernetes.io/projected/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-kube-api-access-q7gtp\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.150079 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.152053 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.152193 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.153614 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-combined-ca-bundle\") pod \"ovn-controller-9qcqj\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") " pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.160633 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564302-ht8rz"] Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.219222 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-etc-ovs\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc 
kubenswrapper[5008]: I0318 18:22:00.219266 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-run\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.219327 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-478tc\" (UniqueName: \"kubernetes.io/projected/f55031bd-9626-475f-a74f-d0e5f8ec8a66-kube-api-access-478tc\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.219355 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-lib\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.219397 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f55031bd-9626-475f-a74f-d0e5f8ec8a66-scripts\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.219425 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-log\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.219654 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" 
(UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-log\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.219705 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-run\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.219703 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-etc-ovs\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.219848 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-lib\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.222119 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f55031bd-9626-475f-a74f-d0e5f8ec8a66-scripts\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.235725 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-478tc\" (UniqueName: \"kubernetes.io/projected/f55031bd-9626-475f-a74f-d0e5f8ec8a66-kube-api-access-478tc\") pod \"ovn-controller-ovs-x8pkm\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") " 
pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.238935 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.272761 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.321945 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxv9g\" (UniqueName: \"kubernetes.io/projected/236a7691-d006-4de5-bd0e-36d58d0e02b4-kube-api-access-nxv9g\") pod \"auto-csr-approver-29564302-ht8rz\" (UID: \"236a7691-d006-4de5-bd0e-36d58d0e02b4\") " pod="openshift-infra/auto-csr-approver-29564302-ht8rz" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.425380 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxv9g\" (UniqueName: \"kubernetes.io/projected/236a7691-d006-4de5-bd0e-36d58d0e02b4-kube-api-access-nxv9g\") pod \"auto-csr-approver-29564302-ht8rz\" (UID: \"236a7691-d006-4de5-bd0e-36d58d0e02b4\") " pod="openshift-infra/auto-csr-approver-29564302-ht8rz" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.447596 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxv9g\" (UniqueName: \"kubernetes.io/projected/236a7691-d006-4de5-bd0e-36d58d0e02b4-kube-api-access-nxv9g\") pod \"auto-csr-approver-29564302-ht8rz\" (UID: \"236a7691-d006-4de5-bd0e-36d58d0e02b4\") " pod="openshift-infra/auto-csr-approver-29564302-ht8rz" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.502390 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564302-ht8rz" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.819755 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.821051 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.830429 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.830651 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-fk6xl" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.831685 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.832265 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.832470 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.833121 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.935797 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.935859 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/defaf26d-efb3-4ab4-96fb-fe8826988fe1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.935899 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.935921 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defaf26d-efb3-4ab4-96fb-fe8826988fe1-config\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.935946 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/defaf26d-efb3-4ab4-96fb-fe8826988fe1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.935963 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvcqt\" (UniqueName: \"kubernetes.io/projected/defaf26d-efb3-4ab4-96fb-fe8826988fe1-kube-api-access-hvcqt\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.935979 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:00 crc kubenswrapper[5008]: I0318 18:22:00.936015 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.037037 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.037330 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.037436 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/defaf26d-efb3-4ab4-96fb-fe8826988fe1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.037543 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" 
Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.038065 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defaf26d-efb3-4ab4-96fb-fe8826988fe1-config\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.038182 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/defaf26d-efb3-4ab4-96fb-fe8826988fe1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.038257 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvcqt\" (UniqueName: \"kubernetes.io/projected/defaf26d-efb3-4ab4-96fb-fe8826988fe1-kube-api-access-hvcqt\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.038338 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.037918 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/defaf26d-efb3-4ab4-96fb-fe8826988fe1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.038800 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.039426 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defaf26d-efb3-4ab4-96fb-fe8826988fe1-config\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.040325 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/defaf26d-efb3-4ab4-96fb-fe8826988fe1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.043306 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.047104 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.048250 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 
18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.064268 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.065270 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvcqt\" (UniqueName: \"kubernetes.io/projected/defaf26d-efb3-4ab4-96fb-fe8826988fe1-kube-api-access-hvcqt\") pod \"ovsdbserver-nb-0\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:01 crc kubenswrapper[5008]: I0318 18:22:01.148626 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.099221 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.100598 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.107617 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.107674 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.107969 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.108049 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-zgq47" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.125634 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.270517 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.270584 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.270731 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bltdw\" (UniqueName: \"kubernetes.io/projected/4edb3df2-7960-412a-ba0f-32bd8fdabc86-kube-api-access-bltdw\") pod \"ovsdbserver-sb-0\" (UID: 
\"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.270863 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4edb3df2-7960-412a-ba0f-32bd8fdabc86-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.270971 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.271034 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4edb3df2-7960-412a-ba0f-32bd8fdabc86-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.271095 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edb3df2-7960-412a-ba0f-32bd8fdabc86-config\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.271124 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 
crc kubenswrapper[5008]: I0318 18:22:03.372135 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4edb3df2-7960-412a-ba0f-32bd8fdabc86-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.372197 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edb3df2-7960-412a-ba0f-32bd8fdabc86-config\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.372218 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.372284 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.372301 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.372331 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bltdw\" (UniqueName: 
\"kubernetes.io/projected/4edb3df2-7960-412a-ba0f-32bd8fdabc86-kube-api-access-bltdw\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.372359 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4edb3df2-7960-412a-ba0f-32bd8fdabc86-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.372394 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.372701 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4edb3df2-7960-412a-ba0f-32bd8fdabc86-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.373488 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edb3df2-7960-412a-ba0f-32bd8fdabc86-config\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.373879 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.374147 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4edb3df2-7960-412a-ba0f-32bd8fdabc86-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.384302 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.384460 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.384466 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.392549 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bltdw\" (UniqueName: \"kubernetes.io/projected/4edb3df2-7960-412a-ba0f-32bd8fdabc86-kube-api-access-bltdw\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.394254 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:03 crc kubenswrapper[5008]: I0318 18:22:03.420599 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:04 crc kubenswrapper[5008]: E0318 18:22:04.799461 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 18 18:22:04 crc kubenswrapper[5008]: E0318 18:22:04.799642 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6v7tt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5448ff6dc7-4n6fm_openstack(17dd6da7-ebe7-4b74-a4df-19c6dec82210): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 18:22:04 crc kubenswrapper[5008]: E0318 18:22:04.800984 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-5448ff6dc7-4n6fm" podUID="17dd6da7-ebe7-4b74-a4df-19c6dec82210" Mar 18 18:22:04 crc kubenswrapper[5008]: E0318 18:22:04.828427 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 18 18:22:04 crc kubenswrapper[5008]: E0318 18:22:04.828583 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rctxd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:n
il,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-64696987c5-lvh9w_openstack(d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 18:22:04 crc kubenswrapper[5008]: E0318 18:22:04.829873 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-64696987c5-lvh9w" podUID="d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5" Mar 18 18:22:04 crc kubenswrapper[5008]: E0318 18:22:04.835262 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 18 18:22:04 crc kubenswrapper[5008]: E0318 18:22:04.835380 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2tjvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-854f47b4f9-qqz54_openstack(dc9d239d-b8eb-4e7a-a630-52617586149e): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Mar 18 18:22:04 crc kubenswrapper[5008]: E0318 18:22:04.838057 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" podUID="dc9d239d-b8eb-4e7a-a630-52617586149e" Mar 18 18:22:05 crc kubenswrapper[5008]: E0318 18:22:05.339342 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" podUID="dc9d239d-b8eb-4e7a-a630-52617586149e" Mar 18 18:22:09 crc kubenswrapper[5008]: E0318 18:22:09.158785 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:2087a09e7ea9f1dbadd433366bb46cc93dd5460ac9606b65f430460f4c2ee18d" Mar 18 18:22:09 crc kubenswrapper[5008]: E0318 18:22:09.159324 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:2087a09e7ea9f1dbadd433366bb46cc93dd5460ac9606b65f430460f4c2ee18d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 
30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ghgp5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerR
esizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(3d5f0191-2702-46ed-ab82-e8c93ec1cf02): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 18:22:09 crc kubenswrapper[5008]: E0318 18:22:09.161157 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="3d5f0191-2702-46ed-ab82-e8c93ec1cf02" Mar 18 18:22:09 crc kubenswrapper[5008]: E0318 18:22:09.201024 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 18 18:22:09 crc kubenswrapper[5008]: E0318 18:22:09.201497 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8z4nk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-54b5dffb47-4kgnr_openstack(af39f6a2-816f-4c55-952b-77a69da86828): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 18:22:09 crc kubenswrapper[5008]: E0318 18:22:09.202932 5008 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" podUID="af39f6a2-816f-4c55-952b-77a69da86828" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.367976 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-4n6fm" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.373104 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-lvh9w" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.420247 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-4n6fm" event={"ID":"17dd6da7-ebe7-4b74-a4df-19c6dec82210","Type":"ContainerDied","Data":"1011ea9428c39e97dcc752018e40935584f58e6af7606effd3cc5ff4cabe4502"} Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.420339 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-4n6fm" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.440122 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-lvh9w" event={"ID":"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5","Type":"ContainerDied","Data":"811e2ff68fa478258e1c4e50442b1168d5fc901f6e84760457134a7a0282c5e1"} Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.440495 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-lvh9w" Mar 18 18:22:09 crc kubenswrapper[5008]: E0318 18:22:09.480007 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" podUID="af39f6a2-816f-4c55-952b-77a69da86828" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.493841 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-config\") pod \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\" (UID: \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\") " Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.494040 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v7tt\" (UniqueName: \"kubernetes.io/projected/17dd6da7-ebe7-4b74-a4df-19c6dec82210-kube-api-access-6v7tt\") pod \"17dd6da7-ebe7-4b74-a4df-19c6dec82210\" (UID: \"17dd6da7-ebe7-4b74-a4df-19c6dec82210\") " Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.494268 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rctxd\" (UniqueName: \"kubernetes.io/projected/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-kube-api-access-rctxd\") pod \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\" (UID: \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\") " Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.494302 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-dns-svc\") pod \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\" (UID: \"d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5\") " Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.494352 5008 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dd6da7-ebe7-4b74-a4df-19c6dec82210-config\") pod \"17dd6da7-ebe7-4b74-a4df-19c6dec82210\" (UID: \"17dd6da7-ebe7-4b74-a4df-19c6dec82210\") " Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.495200 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17dd6da7-ebe7-4b74-a4df-19c6dec82210-config" (OuterVolumeSpecName: "config") pod "17dd6da7-ebe7-4b74-a4df-19c6dec82210" (UID: "17dd6da7-ebe7-4b74-a4df-19c6dec82210"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.495268 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-config" (OuterVolumeSpecName: "config") pod "d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5" (UID: "d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.495637 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5" (UID: "d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.511917 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17dd6da7-ebe7-4b74-a4df-19c6dec82210-kube-api-access-6v7tt" (OuterVolumeSpecName: "kube-api-access-6v7tt") pod "17dd6da7-ebe7-4b74-a4df-19c6dec82210" (UID: "17dd6da7-ebe7-4b74-a4df-19c6dec82210"). InnerVolumeSpecName "kube-api-access-6v7tt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.526708 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-kube-api-access-rctxd" (OuterVolumeSpecName: "kube-api-access-rctxd") pod "d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5" (UID: "d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5"). InnerVolumeSpecName "kube-api-access-rctxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.604210 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rctxd\" (UniqueName: \"kubernetes.io/projected/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-kube-api-access-rctxd\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.604244 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.604260 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dd6da7-ebe7-4b74-a4df-19c6dec82210-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.604271 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.604281 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v7tt\" (UniqueName: \"kubernetes.io/projected/17dd6da7-ebe7-4b74-a4df-19c6dec82210-kube-api-access-6v7tt\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.788568 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5448ff6dc7-4n6fm"] Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.821612 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-4n6fm"] Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.838368 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-lvh9w"] Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.846418 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-lvh9w"] Mar 18 18:22:09 crc kubenswrapper[5008]: W0318 18:22:09.888735 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07bd6644_ca18_4b8d_ad83_9757257768fb.slice/crio-6b4bbe809426c149275bcc733e3305d775b85bc2c067d17587c0a8995a65da18 WatchSource:0}: Error finding container 6b4bbe809426c149275bcc733e3305d775b85bc2c067d17587c0a8995a65da18: Status 404 returned error can't find the container with id 6b4bbe809426c149275bcc733e3305d775b85bc2c067d17587c0a8995a65da18 Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.905830 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.914424 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 18:22:09 crc kubenswrapper[5008]: I0318 18:22:09.920534 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.103978 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:22:10 crc kubenswrapper[5008]: W0318 18:22:10.104220 5008 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96ede22a_8990_49f1_8bb9_f0c08bb3c8b3.slice/crio-24ae8ff3e7ba0ad952a9120f4330a6b8038c5dfcd70abbd4762bf02cfeca2300 WatchSource:0}: Error finding container 24ae8ff3e7ba0ad952a9120f4330a6b8038c5dfcd70abbd4762bf02cfeca2300: Status 404 returned error can't find the container with id 24ae8ff3e7ba0ad952a9120f4330a6b8038c5dfcd70abbd4762bf02cfeca2300 Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.181431 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564302-ht8rz"] Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.211421 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17dd6da7-ebe7-4b74-a4df-19c6dec82210" path="/var/lib/kubelet/pods/17dd6da7-ebe7-4b74-a4df-19c6dec82210/volumes" Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.211925 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5" path="/var/lib/kubelet/pods/d4ebdd38-28d8-4664-b8b1-e55af2bc4ac5/volumes" Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.223203 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9qcqj"] Mar 18 18:22:10 crc kubenswrapper[5008]: W0318 18:22:10.226673 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod236a7691_d006_4de5_bd0e_36d58d0e02b4.slice/crio-01a1db1ff58320bc8139343193362a2ccaaaff38a3913c652c8caa3e22cc96bb WatchSource:0}: Error finding container 01a1db1ff58320bc8139343193362a2ccaaaff38a3913c652c8caa3e22cc96bb: Status 404 returned error can't find the container with id 01a1db1ff58320bc8139343193362a2ccaaaff38a3913c652c8caa3e22cc96bb Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.298321 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 18:22:10 crc kubenswrapper[5008]: W0318 18:22:10.352291 
5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa3cc5e4_3fd1_48ea_a992_a2b5e76f183c.slice/crio-4410d4ffa290a2192a4a02ef2467e4fab3f152dbd39c8446abba6e1b4840b3e1 WatchSource:0}: Error finding container 4410d4ffa290a2192a4a02ef2467e4fab3f152dbd39c8446abba6e1b4840b3e1: Status 404 returned error can't find the container with id 4410d4ffa290a2192a4a02ef2467e4fab3f152dbd39c8446abba6e1b4840b3e1 Mar 18 18:22:10 crc kubenswrapper[5008]: W0318 18:22:10.355373 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4edb3df2_7960_412a_ba0f_32bd8fdabc86.slice/crio-14f529541100816645fc2e14c39aac9db90e837516c6820a8088594e813b6578 WatchSource:0}: Error finding container 14f529541100816645fc2e14c39aac9db90e837516c6820a8088594e813b6578: Status 404 returned error can't find the container with id 14f529541100816645fc2e14c39aac9db90e837516c6820a8088594e813b6578 Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.414186 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-x8pkm"] Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.451610 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"b61732c3f8965875c6dd9c25b3aae8cc8d81ecff790f1af827da9944801bd467"} Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.453938 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"96ede22a-8990-49f1-8bb9-f0c08bb3c8b3","Type":"ContainerStarted","Data":"24ae8ff3e7ba0ad952a9120f4330a6b8038c5dfcd70abbd4762bf02cfeca2300"} Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.455337 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564302-ht8rz" 
event={"ID":"236a7691-d006-4de5-bd0e-36d58d0e02b4","Type":"ContainerStarted","Data":"01a1db1ff58320bc8139343193362a2ccaaaff38a3913c652c8caa3e22cc96bb"} Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.456422 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6cd78c73-6590-4035-af7d-357b8451f0ad","Type":"ContainerStarted","Data":"6e2596634a432643cb927ad71b51f78ca543e4f3375523619d8da0fd72245781"} Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.457301 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"07bd6644-ca18-4b8d-ad83-9757257768fb","Type":"ContainerStarted","Data":"6b4bbe809426c149275bcc733e3305d775b85bc2c067d17587c0a8995a65da18"} Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.458721 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8724770c-4223-4cfe-b35b-be7cd1a6a9ff","Type":"ContainerStarted","Data":"349d7e3ce0980f2e4b1f9aea6abbe60c002a19772a32990d7ada7a0288f21bbe"} Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.460024 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4edb3df2-7960-412a-ba0f-32bd8fdabc86","Type":"ContainerStarted","Data":"14f529541100816645fc2e14c39aac9db90e837516c6820a8088594e813b6578"} Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.461139 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9qcqj" event={"ID":"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c","Type":"ContainerStarted","Data":"4410d4ffa290a2192a4a02ef2467e4fab3f152dbd39c8446abba6e1b4840b3e1"} Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.463389 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-x8pkm" event={"ID":"f55031bd-9626-475f-a74f-d0e5f8ec8a66","Type":"ContainerStarted","Data":"a175d6328c98d5d881a54a4fccffe8543c4577dce2965fc3617be2f366d880e5"} Mar 18 
18:22:10 crc kubenswrapper[5008]: W0318 18:22:10.923009 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddefaf26d_efb3_4ab4_96fb_fe8826988fe1.slice/crio-c0b3f9548c760476e2019825a5ad59b2c69f08d935e9188ffa27f68089c3a4d9 WatchSource:0}: Error finding container c0b3f9548c760476e2019825a5ad59b2c69f08d935e9188ffa27f68089c3a4d9: Status 404 returned error can't find the container with id c0b3f9548c760476e2019825a5ad59b2c69f08d935e9188ffa27f68089c3a4d9 Mar 18 18:22:10 crc kubenswrapper[5008]: I0318 18:22:10.923881 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 18:22:11 crc kubenswrapper[5008]: I0318 18:22:11.492518 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3d5f0191-2702-46ed-ab82-e8c93ec1cf02","Type":"ContainerStarted","Data":"0fa5685cc03d616eba3ae80660451a0ccfc2e904f623b03160efae22735e2260"} Mar 18 18:22:11 crc kubenswrapper[5008]: I0318 18:22:11.500221 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"defaf26d-efb3-4ab4-96fb-fe8826988fe1","Type":"ContainerStarted","Data":"c0b3f9548c760476e2019825a5ad59b2c69f08d935e9188ffa27f68089c3a4d9"} Mar 18 18:22:12 crc kubenswrapper[5008]: I0318 18:22:12.507598 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b60d757b-db66-46c1-ad92-4a9e591217a0","Type":"ContainerStarted","Data":"52849f9d96333bfa73b7b07421af885ac8db5ecfd31488330c2f6db6ec84ddc2"} Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.103864 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.557259 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"6cd78c73-6590-4035-af7d-357b8451f0ad","Type":"ContainerStarted","Data":"a10f7834955df6f8ab45ed451b62d37ed1daf41ca59473a2c03038d777c8d5dd"} Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.557679 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.559050 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9qcqj" event={"ID":"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c","Type":"ContainerStarted","Data":"17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f"} Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.559173 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.560637 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"07bd6644-ca18-4b8d-ad83-9757257768fb","Type":"ContainerStarted","Data":"3f3486d7e9bc9fc522f426c4d18938f1448d83d50daec3ae5eec87e410c91eb6"} Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.561885 5008 generic.go:334] "Generic (PLEG): container finished" podID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerID="2a851048633c8702a158c7035b8ed097c86030e31bb77b6fbb19f72283c707c2" exitCode=0 Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.561933 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-x8pkm" event={"ID":"f55031bd-9626-475f-a74f-d0e5f8ec8a66","Type":"ContainerDied","Data":"2a851048633c8702a158c7035b8ed097c86030e31bb77b6fbb19f72283c707c2"} Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.564523 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8724770c-4223-4cfe-b35b-be7cd1a6a9ff","Type":"ContainerStarted","Data":"4c8e9dc94e541b20c0bd59bf7201a375c6c5baca750bed30c0ce7aadbe5de66b"} Mar 18 18:22:18 crc 
kubenswrapper[5008]: I0318 18:22:18.565942 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4edb3df2-7960-412a-ba0f-32bd8fdabc86","Type":"ContainerStarted","Data":"7105fd9adfc4911e01e2a18a48dd35e4e9e7daabc38c06b0e726445c47171d4a"} Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.567962 5008 generic.go:334] "Generic (PLEG): container finished" podID="236a7691-d006-4de5-bd0e-36d58d0e02b4" containerID="653fc6fe31e948d5c8417ebf79632448d2a73049f8cf689fc312108d8064ed57" exitCode=0 Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.568033 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564302-ht8rz" event={"ID":"236a7691-d006-4de5-bd0e-36d58d0e02b4","Type":"ContainerDied","Data":"653fc6fe31e948d5c8417ebf79632448d2a73049f8cf689fc312108d8064ed57"} Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.570897 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"defaf26d-efb3-4ab4-96fb-fe8826988fe1","Type":"ContainerStarted","Data":"afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54"} Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.594012 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.070981417 podStartE2EDuration="25.593985983s" podCreationTimestamp="2026-03-18 18:21:53 +0000 UTC" firstStartedPulling="2026-03-18 18:22:09.879818528 +0000 UTC m=+1186.399291607" lastFinishedPulling="2026-03-18 18:22:16.402823094 +0000 UTC m=+1192.922296173" observedRunningTime="2026-03-18 18:22:18.579031329 +0000 UTC m=+1195.098504408" watchObservedRunningTime="2026-03-18 18:22:18.593985983 +0000 UTC m=+1195.113459062" Mar 18 18:22:18 crc kubenswrapper[5008]: I0318 18:22:18.675702 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9qcqj" podStartSLOduration=12.935732349 
podStartE2EDuration="19.675679734s" podCreationTimestamp="2026-03-18 18:21:59 +0000 UTC" firstStartedPulling="2026-03-18 18:22:10.354285324 +0000 UTC m=+1186.873758423" lastFinishedPulling="2026-03-18 18:22:17.094232719 +0000 UTC m=+1193.613705808" observedRunningTime="2026-03-18 18:22:18.66708449 +0000 UTC m=+1195.186557589" watchObservedRunningTime="2026-03-18 18:22:18.675679734 +0000 UTC m=+1195.195152833" Mar 18 18:22:19 crc kubenswrapper[5008]: I0318 18:22:19.582534 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-x8pkm" event={"ID":"f55031bd-9626-475f-a74f-d0e5f8ec8a66","Type":"ContainerStarted","Data":"90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f"} Mar 18 18:22:19 crc kubenswrapper[5008]: I0318 18:22:19.583316 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-x8pkm" event={"ID":"f55031bd-9626-475f-a74f-d0e5f8ec8a66","Type":"ContainerStarted","Data":"240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57"} Mar 18 18:22:19 crc kubenswrapper[5008]: I0318 18:22:19.612422 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-x8pkm" podStartSLOduration=14.627760056 podStartE2EDuration="20.61240451s" podCreationTimestamp="2026-03-18 18:21:59 +0000 UTC" firstStartedPulling="2026-03-18 18:22:10.41818966 +0000 UTC m=+1186.937662739" lastFinishedPulling="2026-03-18 18:22:16.402834114 +0000 UTC m=+1192.922307193" observedRunningTime="2026-03-18 18:22:19.608229765 +0000 UTC m=+1196.127702844" watchObservedRunningTime="2026-03-18 18:22:19.61240451 +0000 UTC m=+1196.131877589" Mar 18 18:22:20 crc kubenswrapper[5008]: I0318 18:22:20.273033 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:20 crc kubenswrapper[5008]: I0318 18:22:20.273364 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:21 crc kubenswrapper[5008]: I0318 18:22:21.596300 5008 generic.go:334] "Generic (PLEG): container finished" podID="07bd6644-ca18-4b8d-ad83-9757257768fb" containerID="3f3486d7e9bc9fc522f426c4d18938f1448d83d50daec3ae5eec87e410c91eb6" exitCode=0 Mar 18 18:22:21 crc kubenswrapper[5008]: I0318 18:22:21.596389 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"07bd6644-ca18-4b8d-ad83-9757257768fb","Type":"ContainerDied","Data":"3f3486d7e9bc9fc522f426c4d18938f1448d83d50daec3ae5eec87e410c91eb6"} Mar 18 18:22:21 crc kubenswrapper[5008]: I0318 18:22:21.598044 5008 generic.go:334] "Generic (PLEG): container finished" podID="8724770c-4223-4cfe-b35b-be7cd1a6a9ff" containerID="4c8e9dc94e541b20c0bd59bf7201a375c6c5baca750bed30c0ce7aadbe5de66b" exitCode=0 Mar 18 18:22:21 crc kubenswrapper[5008]: I0318 18:22:21.598094 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8724770c-4223-4cfe-b35b-be7cd1a6a9ff","Type":"ContainerDied","Data":"4c8e9dc94e541b20c0bd59bf7201a375c6c5baca750bed30c0ce7aadbe5de66b"} Mar 18 18:22:21 crc kubenswrapper[5008]: I0318 18:22:21.600939 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564302-ht8rz" event={"ID":"236a7691-d006-4de5-bd0e-36d58d0e02b4","Type":"ContainerDied","Data":"01a1db1ff58320bc8139343193362a2ccaaaff38a3913c652c8caa3e22cc96bb"} Mar 18 18:22:21 crc kubenswrapper[5008]: I0318 18:22:21.600967 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01a1db1ff58320bc8139343193362a2ccaaaff38a3913c652c8caa3e22cc96bb" Mar 18 18:22:21 crc kubenswrapper[5008]: I0318 18:22:21.648464 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564302-ht8rz" Mar 18 18:22:21 crc kubenswrapper[5008]: I0318 18:22:21.815515 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxv9g\" (UniqueName: \"kubernetes.io/projected/236a7691-d006-4de5-bd0e-36d58d0e02b4-kube-api-access-nxv9g\") pod \"236a7691-d006-4de5-bd0e-36d58d0e02b4\" (UID: \"236a7691-d006-4de5-bd0e-36d58d0e02b4\") " Mar 18 18:22:21 crc kubenswrapper[5008]: I0318 18:22:21.825883 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236a7691-d006-4de5-bd0e-36d58d0e02b4-kube-api-access-nxv9g" (OuterVolumeSpecName: "kube-api-access-nxv9g") pod "236a7691-d006-4de5-bd0e-36d58d0e02b4" (UID: "236a7691-d006-4de5-bd0e-36d58d0e02b4"). InnerVolumeSpecName "kube-api-access-nxv9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:21 crc kubenswrapper[5008]: I0318 18:22:21.917379 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxv9g\" (UniqueName: \"kubernetes.io/projected/236a7691-d006-4de5-bd0e-36d58d0e02b4-kube-api-access-nxv9g\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:22 crc kubenswrapper[5008]: I0318 18:22:22.608039 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"96ede22a-8990-49f1-8bb9-f0c08bb3c8b3","Type":"ContainerStarted","Data":"abbcba4b3618e5a61f5319eee8d12ca14a70d169321ac4caec0de0373a4a74ee"} Mar 18 18:22:22 crc kubenswrapper[5008]: I0318 18:22:22.609090 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 18:22:22 crc kubenswrapper[5008]: I0318 18:22:22.611542 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4edb3df2-7960-412a-ba0f-32bd8fdabc86","Type":"ContainerStarted","Data":"7e44b2e0f1ce0f0062f29542f63b0c0364c6a86812c315179e49f26887d11a6d"} Mar 18 18:22:22 crc 
kubenswrapper[5008]: I0318 18:22:22.613507 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"07bd6644-ca18-4b8d-ad83-9757257768fb","Type":"ContainerStarted","Data":"4cc6ebdf06122d5824ebe49392c76f20a08604dc1120f4380c5f58b19d7e4d01"} Mar 18 18:22:22 crc kubenswrapper[5008]: I0318 18:22:22.615001 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8724770c-4223-4cfe-b35b-be7cd1a6a9ff","Type":"ContainerStarted","Data":"2735f632f3585bddce99d7d606ae426bc322f9f8793ae4e5d5d4ce755bf8652e"} Mar 18 18:22:22 crc kubenswrapper[5008]: I0318 18:22:22.616610 5008 generic.go:334] "Generic (PLEG): container finished" podID="dc9d239d-b8eb-4e7a-a630-52617586149e" containerID="bbbb55c5eb70c00069c749cc5769d545b334bdf817a7b5f735650f820f67d5fd" exitCode=0 Mar 18 18:22:22 crc kubenswrapper[5008]: I0318 18:22:22.616682 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564302-ht8rz" Mar 18 18:22:22 crc kubenswrapper[5008]: I0318 18:22:22.616693 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" event={"ID":"dc9d239d-b8eb-4e7a-a630-52617586149e","Type":"ContainerDied","Data":"bbbb55c5eb70c00069c749cc5769d545b334bdf817a7b5f735650f820f67d5fd"} Mar 18 18:22:22 crc kubenswrapper[5008]: I0318 18:22:22.632929 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.439169593999999 podStartE2EDuration="27.632910591s" podCreationTimestamp="2026-03-18 18:21:55 +0000 UTC" firstStartedPulling="2026-03-18 18:22:10.106384879 +0000 UTC m=+1186.625857958" lastFinishedPulling="2026-03-18 18:22:22.300125876 +0000 UTC m=+1198.819598955" observedRunningTime="2026-03-18 18:22:22.631075635 +0000 UTC m=+1199.150548744" watchObservedRunningTime="2026-03-18 18:22:22.632910591 +0000 UTC m=+1199.152383680" Mar 18 
18:22:22 crc kubenswrapper[5008]: I0318 18:22:22.663513 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.558751934 podStartE2EDuration="32.663494365s" podCreationTimestamp="2026-03-18 18:21:50 +0000 UTC" firstStartedPulling="2026-03-18 18:22:09.908838983 +0000 UTC m=+1186.428312062" lastFinishedPulling="2026-03-18 18:22:17.013581424 +0000 UTC m=+1193.533054493" observedRunningTime="2026-03-18 18:22:22.662161692 +0000 UTC m=+1199.181634791" watchObservedRunningTime="2026-03-18 18:22:22.663494365 +0000 UTC m=+1199.182967454" Mar 18 18:22:22 crc kubenswrapper[5008]: I0318 18:22:22.695293 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.951828946 podStartE2EDuration="31.695275629s" podCreationTimestamp="2026-03-18 18:21:51 +0000 UTC" firstStartedPulling="2026-03-18 18:22:09.890158387 +0000 UTC m=+1186.409631466" lastFinishedPulling="2026-03-18 18:22:16.63360506 +0000 UTC m=+1193.153078149" observedRunningTime="2026-03-18 18:22:22.689906285 +0000 UTC m=+1199.209379364" watchObservedRunningTime="2026-03-18 18:22:22.695275629 +0000 UTC m=+1199.214748718" Mar 18 18:22:22 crc kubenswrapper[5008]: I0318 18:22:22.721753 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.78079163 podStartE2EDuration="20.72173518s" podCreationTimestamp="2026-03-18 18:22:02 +0000 UTC" firstStartedPulling="2026-03-18 18:22:10.361268688 +0000 UTC m=+1186.880741767" lastFinishedPulling="2026-03-18 18:22:22.302212238 +0000 UTC m=+1198.821685317" observedRunningTime="2026-03-18 18:22:22.71610989 +0000 UTC m=+1199.235582969" watchObservedRunningTime="2026-03-18 18:22:22.72173518 +0000 UTC m=+1199.241208269" Mar 18 18:22:22 crc kubenswrapper[5008]: I0318 18:22:22.737357 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-infra/auto-csr-approver-29564296-8jv4l"] Mar 18 18:22:22 crc kubenswrapper[5008]: I0318 18:22:22.744002 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564296-8jv4l"] Mar 18 18:22:23 crc kubenswrapper[5008]: I0318 18:22:23.421150 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:23 crc kubenswrapper[5008]: I0318 18:22:23.554701 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 18:22:23 crc kubenswrapper[5008]: I0318 18:22:23.554945 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 18:22:23 crc kubenswrapper[5008]: I0318 18:22:23.629458 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" event={"ID":"dc9d239d-b8eb-4e7a-a630-52617586149e","Type":"ContainerStarted","Data":"c1567caaf7d36b22bacdb8c91403da3831b1448abdbf2e8902c28f93a9017228"} Mar 18 18:22:23 crc kubenswrapper[5008]: I0318 18:22:23.630771 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:22:23 crc kubenswrapper[5008]: I0318 18:22:23.633752 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"defaf26d-efb3-4ab4-96fb-fe8826988fe1","Type":"ContainerStarted","Data":"5baba2548d7c511ef77d2873377d1b6c415a3d1316f182988c6113417f3cbbc0"} Mar 18 18:22:23 crc kubenswrapper[5008]: I0318 18:22:23.650985 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 18:22:23 crc kubenswrapper[5008]: I0318 18:22:23.656211 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" podStartSLOduration=2.444154914 podStartE2EDuration="35.656188608s" podCreationTimestamp="2026-03-18 
18:21:48 +0000 UTC" firstStartedPulling="2026-03-18 18:21:49.091723242 +0000 UTC m=+1165.611196321" lastFinishedPulling="2026-03-18 18:22:22.303756936 +0000 UTC m=+1198.823230015" observedRunningTime="2026-03-18 18:22:23.654222469 +0000 UTC m=+1200.173695558" watchObservedRunningTime="2026-03-18 18:22:23.656188608 +0000 UTC m=+1200.175661697" Mar 18 18:22:23 crc kubenswrapper[5008]: I0318 18:22:23.727882 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.349122787 podStartE2EDuration="24.727854619s" podCreationTimestamp="2026-03-18 18:21:59 +0000 UTC" firstStartedPulling="2026-03-18 18:22:10.925902656 +0000 UTC m=+1187.445375735" lastFinishedPulling="2026-03-18 18:22:22.304634488 +0000 UTC m=+1198.824107567" observedRunningTime="2026-03-18 18:22:23.717130091 +0000 UTC m=+1200.236603180" watchObservedRunningTime="2026-03-18 18:22:23.727854619 +0000 UTC m=+1200.247327748" Mar 18 18:22:24 crc kubenswrapper[5008]: I0318 18:22:24.208793 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25c5dcfb-6da1-4284-9d73-08a9e08a3e85" path="/var/lib/kubelet/pods/25c5dcfb-6da1-4284-9d73-08a9e08a3e85/volumes" Mar 18 18:22:24 crc kubenswrapper[5008]: I0318 18:22:24.421708 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:24 crc kubenswrapper[5008]: I0318 18:22:24.460464 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:24 crc kubenswrapper[5008]: I0318 18:22:24.679238 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 18:22:24 crc kubenswrapper[5008]: I0318 18:22:24.995422 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-qqz54"] Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.020425 5008 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-7ccf85c649-bxxns"] Mar 18 18:22:25 crc kubenswrapper[5008]: E0318 18:22:25.020725 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236a7691-d006-4de5-bd0e-36d58d0e02b4" containerName="oc" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.020740 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="236a7691-d006-4de5-bd0e-36d58d0e02b4" containerName="oc" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.020912 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="236a7691-d006-4de5-bd0e-36d58d0e02b4" containerName="oc" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.021670 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.025189 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.032039 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ccf85c649-bxxns"] Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.104696 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-78xsw"] Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.105604 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.118855 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.119499 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-78xsw"] Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.148768 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.173835 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-dns-svc\") pod \"dnsmasq-dns-7ccf85c649-bxxns\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.173887 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-ovsdbserver-sb\") pod \"dnsmasq-dns-7ccf85c649-bxxns\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.173908 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-config\") pod \"dnsmasq-dns-7ccf85c649-bxxns\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.173951 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd64r\" (UniqueName: 
\"kubernetes.io/projected/50828756-f458-460c-9b9a-c6890f956394-kube-api-access-wd64r\") pod \"dnsmasq-dns-7ccf85c649-bxxns\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.192800 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.274948 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-dns-svc\") pod \"dnsmasq-dns-7ccf85c649-bxxns\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.275008 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-ovsdbserver-sb\") pod \"dnsmasq-dns-7ccf85c649-bxxns\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.275031 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-config\") pod \"dnsmasq-dns-7ccf85c649-bxxns\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.275108 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-config\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.275130 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-ovs-rundir\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.275170 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd64r\" (UniqueName: \"kubernetes.io/projected/50828756-f458-460c-9b9a-c6890f956394-kube-api-access-wd64r\") pod \"dnsmasq-dns-7ccf85c649-bxxns\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.275185 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-ovn-rundir\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.275217 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfhkp\" (UniqueName: \"kubernetes.io/projected/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-kube-api-access-sfhkp\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.275242 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc 
kubenswrapper[5008]: I0318 18:22:25.275273 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-combined-ca-bundle\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.275877 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-dns-svc\") pod \"dnsmasq-dns-7ccf85c649-bxxns\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.275936 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-ovsdbserver-sb\") pod \"dnsmasq-dns-7ccf85c649-bxxns\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.276468 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-config\") pod \"dnsmasq-dns-7ccf85c649-bxxns\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.293408 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd64r\" (UniqueName: \"kubernetes.io/projected/50828756-f458-460c-9b9a-c6890f956394-kube-api-access-wd64r\") pod \"dnsmasq-dns-7ccf85c649-bxxns\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.341976 5008 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.377494 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-config\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.377535 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-ovs-rundir\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.377610 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-ovn-rundir\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.377646 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfhkp\" (UniqueName: \"kubernetes.io/projected/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-kube-api-access-sfhkp\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.377672 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " 
pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.377746 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-combined-ca-bundle\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.378712 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-ovs-rundir\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.379262 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-config\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.379495 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-ovn-rundir\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.385287 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 
18:22:25.394028 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-combined-ca-bundle\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.396064 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-4kgnr"] Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.407685 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfhkp\" (UniqueName: \"kubernetes.io/projected/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-kube-api-access-sfhkp\") pod \"ovn-controller-metrics-78xsw\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.419827 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.425105 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-94sv2"] Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.429705 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.445624 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.487145 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-94sv2"] Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.585992 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-ovsdbserver-nb\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.587361 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-config\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.587529 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-ovsdbserver-sb\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.587694 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shxm7\" (UniqueName: \"kubernetes.io/projected/b622014a-11bc-48b9-9960-08670363a6a5-kube-api-access-shxm7\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " 
pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.587729 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-dns-svc\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.664185 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ccf85c649-bxxns"] Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.674541 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" podUID="dc9d239d-b8eb-4e7a-a630-52617586149e" containerName="dnsmasq-dns" containerID="cri-o://c1567caaf7d36b22bacdb8c91403da3831b1448abdbf2e8902c28f93a9017228" gracePeriod=10 Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.674789 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" event={"ID":"af39f6a2-816f-4c55-952b-77a69da86828","Type":"ContainerStarted","Data":"404bb7bcfa64a7d767749eca17b0110c441906ac9f1740647c3deb4b89487b4b"} Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.675015 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.703043 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shxm7\" (UniqueName: \"kubernetes.io/projected/b622014a-11bc-48b9-9960-08670363a6a5-kube-api-access-shxm7\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.703089 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-dns-svc\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.703126 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-ovsdbserver-nb\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.703151 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-config\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.703179 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-ovsdbserver-sb\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.711221 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-ovsdbserver-sb\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.712164 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-ovsdbserver-nb\") pod 
\"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.712419 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-config\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.712688 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-dns-svc\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.744180 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-8xdxz"] Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.746004 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.756107 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.757116 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shxm7\" (UniqueName: \"kubernetes.io/projected/b622014a-11bc-48b9-9960-08670363a6a5-kube-api-access-shxm7\") pod \"dnsmasq-dns-f697c8bff-94sv2\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.757997 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-8xdxz"] Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.910899 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-config\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.910959 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-ovsdbserver-sb\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.910985 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dwpw\" (UniqueName: \"kubernetes.io/projected/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-kube-api-access-6dwpw\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " 
pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.911022 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-dns-svc\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.911052 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-ovsdbserver-nb\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.982293 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 18:22:25 crc kubenswrapper[5008]: I0318 18:22:25.983763 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.000715 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.000801 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-2d4ff" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.000839 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.000894 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.006972 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.012224 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-dns-svc\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.012284 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-ovsdbserver-nb\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.012346 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-config\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " 
pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.012383 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-ovsdbserver-sb\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.012404 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dwpw\" (UniqueName: \"kubernetes.io/projected/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-kube-api-access-6dwpw\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.013406 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-dns-svc\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.013832 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-config\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.013960 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-ovsdbserver-sb\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 
18:22:26.014415 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-ovsdbserver-nb\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.034655 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ccf85c649-bxxns"] Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.037752 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dwpw\" (UniqueName: \"kubernetes.io/projected/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-kube-api-access-6dwpw\") pod \"dnsmasq-dns-b4ddd5fb7-8xdxz\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.064844 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.114954 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.115281 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-config\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.115344 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.115415 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.115584 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.115608 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: 
I0318 18:22:26.115705 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nc8m\" (UniqueName: \"kubernetes.io/projected/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-kube-api-access-4nc8m\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.115741 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-scripts\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.130088 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.166483 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.185620 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-78xsw"] Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.217770 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc9d239d-b8eb-4e7a-a630-52617586149e-dns-svc\") pod \"dc9d239d-b8eb-4e7a-a630-52617586149e\" (UID: \"dc9d239d-b8eb-4e7a-a630-52617586149e\") " Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.217847 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tjvq\" (UniqueName: \"kubernetes.io/projected/dc9d239d-b8eb-4e7a-a630-52617586149e-kube-api-access-2tjvq\") pod \"dc9d239d-b8eb-4e7a-a630-52617586149e\" (UID: \"dc9d239d-b8eb-4e7a-a630-52617586149e\") " Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.217917 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc9d239d-b8eb-4e7a-a630-52617586149e-config\") pod \"dc9d239d-b8eb-4e7a-a630-52617586149e\" (UID: \"dc9d239d-b8eb-4e7a-a630-52617586149e\") " Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.218302 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.218362 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nc8m\" (UniqueName: \"kubernetes.io/projected/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-kube-api-access-4nc8m\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " 
pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.218416 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-scripts\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.218450 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-config\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.218473 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.218621 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.218688 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.220514 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-config\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.220746 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-scripts\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.221142 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.226823 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.227263 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.240116 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc9d239d-b8eb-4e7a-a630-52617586149e-kube-api-access-2tjvq" (OuterVolumeSpecName: "kube-api-access-2tjvq") pod "dc9d239d-b8eb-4e7a-a630-52617586149e" (UID: "dc9d239d-b8eb-4e7a-a630-52617586149e"). InnerVolumeSpecName "kube-api-access-2tjvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.247572 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.263346 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nc8m\" (UniqueName: \"kubernetes.io/projected/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-kube-api-access-4nc8m\") pod \"ovn-northd-0\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.297058 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.319527 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z4nk\" (UniqueName: \"kubernetes.io/projected/af39f6a2-816f-4c55-952b-77a69da86828-kube-api-access-8z4nk\") pod \"af39f6a2-816f-4c55-952b-77a69da86828\" (UID: \"af39f6a2-816f-4c55-952b-77a69da86828\") " Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.319657 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af39f6a2-816f-4c55-952b-77a69da86828-config\") pod \"af39f6a2-816f-4c55-952b-77a69da86828\" (UID: \"af39f6a2-816f-4c55-952b-77a69da86828\") " Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.319712 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af39f6a2-816f-4c55-952b-77a69da86828-dns-svc\") pod \"af39f6a2-816f-4c55-952b-77a69da86828\" (UID: \"af39f6a2-816f-4c55-952b-77a69da86828\") " Mar 18 
18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.320171 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tjvq\" (UniqueName: \"kubernetes.io/projected/dc9d239d-b8eb-4e7a-a630-52617586149e-kube-api-access-2tjvq\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.327213 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af39f6a2-816f-4c55-952b-77a69da86828-kube-api-access-8z4nk" (OuterVolumeSpecName: "kube-api-access-8z4nk") pod "af39f6a2-816f-4c55-952b-77a69da86828" (UID: "af39f6a2-816f-4c55-952b-77a69da86828"). InnerVolumeSpecName "kube-api-access-8z4nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.350536 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc9d239d-b8eb-4e7a-a630-52617586149e-config" (OuterVolumeSpecName: "config") pod "dc9d239d-b8eb-4e7a-a630-52617586149e" (UID: "dc9d239d-b8eb-4e7a-a630-52617586149e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.395888 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc9d239d-b8eb-4e7a-a630-52617586149e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dc9d239d-b8eb-4e7a-a630-52617586149e" (UID: "dc9d239d-b8eb-4e7a-a630-52617586149e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.395915 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af39f6a2-816f-4c55-952b-77a69da86828-config" (OuterVolumeSpecName: "config") pod "af39f6a2-816f-4c55-952b-77a69da86828" (UID: "af39f6a2-816f-4c55-952b-77a69da86828"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.403344 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af39f6a2-816f-4c55-952b-77a69da86828-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af39f6a2-816f-4c55-952b-77a69da86828" (UID: "af39f6a2-816f-4c55-952b-77a69da86828"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.423515 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc9d239d-b8eb-4e7a-a630-52617586149e-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.423577 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z4nk\" (UniqueName: \"kubernetes.io/projected/af39f6a2-816f-4c55-952b-77a69da86828-kube-api-access-8z4nk\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.423589 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af39f6a2-816f-4c55-952b-77a69da86828-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.423597 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af39f6a2-816f-4c55-952b-77a69da86828-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.423607 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc9d239d-b8eb-4e7a-a630-52617586149e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.665241 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-8xdxz"] Mar 18 18:22:26 crc kubenswrapper[5008]: W0318 18:22:26.687876 
5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b085aa0_d1ca_47a4_9b12_588dc9be67fe.slice/crio-2ed2580cbeeff737c4d9a1292396a4ee8b39ba25e773cc3c4a51adce9e93a7cd WatchSource:0}: Error finding container 2ed2580cbeeff737c4d9a1292396a4ee8b39ba25e773cc3c4a51adce9e93a7cd: Status 404 returned error can't find the container with id 2ed2580cbeeff737c4d9a1292396a4ee8b39ba25e773cc3c4a51adce9e93a7cd Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.701381 5008 generic.go:334] "Generic (PLEG): container finished" podID="dc9d239d-b8eb-4e7a-a630-52617586149e" containerID="c1567caaf7d36b22bacdb8c91403da3831b1448abdbf2e8902c28f93a9017228" exitCode=0 Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.701425 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.701459 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" event={"ID":"dc9d239d-b8eb-4e7a-a630-52617586149e","Type":"ContainerDied","Data":"c1567caaf7d36b22bacdb8c91403da3831b1448abdbf2e8902c28f93a9017228"} Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.701518 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f47b4f9-qqz54" event={"ID":"dc9d239d-b8eb-4e7a-a630-52617586149e","Type":"ContainerDied","Data":"0256b436e152155e6c17ada2b8cc51c705b70346777fe114ad5cb1cc704384d4"} Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.701542 5008 scope.go:117] "RemoveContainer" containerID="c1567caaf7d36b22bacdb8c91403da3831b1448abdbf2e8902c28f93a9017228" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.703438 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" 
event={"ID":"9b085aa0-d1ca-47a4-9b12-588dc9be67fe","Type":"ContainerStarted","Data":"2ed2580cbeeff737c4d9a1292396a4ee8b39ba25e773cc3c4a51adce9e93a7cd"} Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.705430 5008 generic.go:334] "Generic (PLEG): container finished" podID="50828756-f458-460c-9b9a-c6890f956394" containerID="3d8a5d075842b9b906d09b36823b74a0a7d70a9fe9a3ff06576568363fde3942" exitCode=0 Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.705675 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" event={"ID":"50828756-f458-460c-9b9a-c6890f956394","Type":"ContainerDied","Data":"3d8a5d075842b9b906d09b36823b74a0a7d70a9fe9a3ff06576568363fde3942"} Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.705776 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" event={"ID":"50828756-f458-460c-9b9a-c6890f956394","Type":"ContainerStarted","Data":"58d4604c7a56aa71a8300a77bd5e7c8fb5d1b67eb4dca8c69ae1a9b1916c44be"} Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.708738 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-78xsw" event={"ID":"a8857503-cb26-46f0-b4a3-e931a9e3f1ed","Type":"ContainerStarted","Data":"82263332669d05ad9b24a3bc4c19472247748eb643154b4c9d4bda10744e11ae"} Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.708780 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-78xsw" event={"ID":"a8857503-cb26-46f0-b4a3-e931a9e3f1ed","Type":"ContainerStarted","Data":"c457a9b60742c8038f2b523b9bba086cb3dcaa191c37cf8408df50194e8210c2"} Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.710876 5008 generic.go:334] "Generic (PLEG): container finished" podID="af39f6a2-816f-4c55-952b-77a69da86828" containerID="404bb7bcfa64a7d767749eca17b0110c441906ac9f1740647c3deb4b89487b4b" exitCode=0 Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.711462 5008 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.712039 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" event={"ID":"af39f6a2-816f-4c55-952b-77a69da86828","Type":"ContainerDied","Data":"404bb7bcfa64a7d767749eca17b0110c441906ac9f1740647c3deb4b89487b4b"} Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.712069 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-4kgnr" event={"ID":"af39f6a2-816f-4c55-952b-77a69da86828","Type":"ContainerDied","Data":"263ac0d9acbc0baf7c7fc5dd1f964337609f6fe152cc9335a4d7dad7736aee5d"} Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.722088 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-94sv2"] Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.742052 5008 scope.go:117] "RemoveContainer" containerID="bbbb55c5eb70c00069c749cc5769d545b334bdf817a7b5f735650f820f67d5fd" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.807723 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-4kgnr"] Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.813772 5008 scope.go:117] "RemoveContainer" containerID="c1567caaf7d36b22bacdb8c91403da3831b1448abdbf2e8902c28f93a9017228" Mar 18 18:22:26 crc kubenswrapper[5008]: E0318 18:22:26.815004 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1567caaf7d36b22bacdb8c91403da3831b1448abdbf2e8902c28f93a9017228\": container with ID starting with c1567caaf7d36b22bacdb8c91403da3831b1448abdbf2e8902c28f93a9017228 not found: ID does not exist" containerID="c1567caaf7d36b22bacdb8c91403da3831b1448abdbf2e8902c28f93a9017228" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.815061 5008 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1567caaf7d36b22bacdb8c91403da3831b1448abdbf2e8902c28f93a9017228"} err="failed to get container status \"c1567caaf7d36b22bacdb8c91403da3831b1448abdbf2e8902c28f93a9017228\": rpc error: code = NotFound desc = could not find container \"c1567caaf7d36b22bacdb8c91403da3831b1448abdbf2e8902c28f93a9017228\": container with ID starting with c1567caaf7d36b22bacdb8c91403da3831b1448abdbf2e8902c28f93a9017228 not found: ID does not exist" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.815092 5008 scope.go:117] "RemoveContainer" containerID="bbbb55c5eb70c00069c749cc5769d545b334bdf817a7b5f735650f820f67d5fd" Mar 18 18:22:26 crc kubenswrapper[5008]: E0318 18:22:26.815487 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbbb55c5eb70c00069c749cc5769d545b334bdf817a7b5f735650f820f67d5fd\": container with ID starting with bbbb55c5eb70c00069c749cc5769d545b334bdf817a7b5f735650f820f67d5fd not found: ID does not exist" containerID="bbbb55c5eb70c00069c749cc5769d545b334bdf817a7b5f735650f820f67d5fd" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.815516 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbbb55c5eb70c00069c749cc5769d545b334bdf817a7b5f735650f820f67d5fd"} err="failed to get container status \"bbbb55c5eb70c00069c749cc5769d545b334bdf817a7b5f735650f820f67d5fd\": rpc error: code = NotFound desc = could not find container \"bbbb55c5eb70c00069c749cc5769d545b334bdf817a7b5f735650f820f67d5fd\": container with ID starting with bbbb55c5eb70c00069c749cc5769d545b334bdf817a7b5f735650f820f67d5fd not found: ID does not exist" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.815531 5008 scope.go:117] "RemoveContainer" containerID="404bb7bcfa64a7d767749eca17b0110c441906ac9f1740647c3deb4b89487b4b" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.833920 5008 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-4kgnr"] Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.871681 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-qqz54"] Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.873762 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-854f47b4f9-qqz54"] Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.901915 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 18 18:22:26 crc kubenswrapper[5008]: E0318 18:22:26.902861 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af39f6a2-816f-4c55-952b-77a69da86828" containerName="init" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.902878 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="af39f6a2-816f-4c55-952b-77a69da86828" containerName="init" Mar 18 18:22:26 crc kubenswrapper[5008]: E0318 18:22:26.902938 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9d239d-b8eb-4e7a-a630-52617586149e" containerName="dnsmasq-dns" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.902946 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9d239d-b8eb-4e7a-a630-52617586149e" containerName="dnsmasq-dns" Mar 18 18:22:26 crc kubenswrapper[5008]: E0318 18:22:26.902969 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9d239d-b8eb-4e7a-a630-52617586149e" containerName="init" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.902975 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9d239d-b8eb-4e7a-a630-52617586149e" containerName="init" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.903280 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9d239d-b8eb-4e7a-a630-52617586149e" containerName="dnsmasq-dns" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.903308 5008 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="af39f6a2-816f-4c55-952b-77a69da86828" containerName="init" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.980352 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.982842 5008 scope.go:117] "RemoveContainer" containerID="404bb7bcfa64a7d767749eca17b0110c441906ac9f1740647c3deb4b89487b4b" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.983141 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 18 18:22:26 crc kubenswrapper[5008]: E0318 18:22:26.984636 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"404bb7bcfa64a7d767749eca17b0110c441906ac9f1740647c3deb4b89487b4b\": container with ID starting with 404bb7bcfa64a7d767749eca17b0110c441906ac9f1740647c3deb4b89487b4b not found: ID does not exist" containerID="404bb7bcfa64a7d767749eca17b0110c441906ac9f1740647c3deb4b89487b4b" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.984677 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"404bb7bcfa64a7d767749eca17b0110c441906ac9f1740647c3deb4b89487b4b"} err="failed to get container status \"404bb7bcfa64a7d767749eca17b0110c441906ac9f1740647c3deb4b89487b4b\": rpc error: code = NotFound desc = could not find container \"404bb7bcfa64a7d767749eca17b0110c441906ac9f1740647c3deb4b89487b4b\": container with ID starting with 404bb7bcfa64a7d767749eca17b0110c441906ac9f1740647c3deb4b89487b4b not found: ID does not exist" Mar 18 18:22:26 crc kubenswrapper[5008]: W0318 18:22:26.984749 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f1c2fc8_83c6_4183_ac62_f23ad5db8610.slice/crio-206cf3d2c63c905b33d66457e0c59d6b9a03aa56d755f65c7bf6f090188ab2df WatchSource:0}: Error 
finding container 206cf3d2c63c905b33d66457e0c59d6b9a03aa56d755f65c7bf6f090188ab2df: Status 404 returned error can't find the container with id 206cf3d2c63c905b33d66457e0c59d6b9a03aa56d755f65c7bf6f090188ab2df Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.985006 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2l7hk" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.985175 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 18 18:22:26 crc kubenswrapper[5008]: I0318 18:22:26.985546 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.019645 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-78xsw" podStartSLOduration=2.019632078 podStartE2EDuration="2.019632078s" podCreationTimestamp="2026-03-18 18:22:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:26.777219471 +0000 UTC m=+1203.296692560" watchObservedRunningTime="2026-03-18 18:22:27.019632078 +0000 UTC m=+1203.539105147" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.044440 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.048368 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nrx4\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-kube-api-access-5nrx4\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.048434 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.048492 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.048521 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fcb3859a-2fc0-4479-a59d-7888246899a9-cache\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.048536 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb3859a-2fc0-4479-a59d-7888246899a9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.048579 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fcb3859a-2fc0-4479-a59d-7888246899a9-lock\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.067199 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.155791 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.155870 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.155909 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fcb3859a-2fc0-4479-a59d-7888246899a9-cache\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.155924 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb3859a-2fc0-4479-a59d-7888246899a9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.155953 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fcb3859a-2fc0-4479-a59d-7888246899a9-lock\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.155989 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nrx4\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-kube-api-access-5nrx4\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: 
I0318 18:22:27.156495 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.158083 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fcb3859a-2fc0-4479-a59d-7888246899a9-cache\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.158665 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fcb3859a-2fc0-4479-a59d-7888246899a9-lock\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: E0318 18:22:27.158688 5008 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 18:22:27 crc kubenswrapper[5008]: E0318 18:22:27.158720 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 18:22:27 crc kubenswrapper[5008]: E0318 18:22:27.158793 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift podName:fcb3859a-2fc0-4479-a59d-7888246899a9 nodeName:}" failed. No retries permitted until 2026-03-18 18:22:27.658768874 +0000 UTC m=+1204.178242043 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift") pod "swift-storage-0" (UID: "fcb3859a-2fc0-4479-a59d-7888246899a9") : configmap "swift-ring-files" not found Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.168285 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb3859a-2fc0-4479-a59d-7888246899a9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.179680 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nrx4\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-kube-api-access-5nrx4\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.223133 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.314839 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.331453 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-klvjh"] Mar 18 18:22:27 crc kubenswrapper[5008]: E0318 18:22:27.331803 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50828756-f458-460c-9b9a-c6890f956394" containerName="init" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.331819 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="50828756-f458-460c-9b9a-c6890f956394" containerName="init" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.331979 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="50828756-f458-460c-9b9a-c6890f956394" containerName="init" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.332455 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.335192 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.337343 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.337379 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.355976 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-klvjh"] Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.357718 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-ovsdbserver-sb\") pod \"50828756-f458-460c-9b9a-c6890f956394\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " 
Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.357759 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd64r\" (UniqueName: \"kubernetes.io/projected/50828756-f458-460c-9b9a-c6890f956394-kube-api-access-wd64r\") pod \"50828756-f458-460c-9b9a-c6890f956394\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.357841 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-dns-svc\") pod \"50828756-f458-460c-9b9a-c6890f956394\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.357862 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-config\") pod \"50828756-f458-460c-9b9a-c6890f956394\" (UID: \"50828756-f458-460c-9b9a-c6890f956394\") " Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.357965 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-combined-ca-bundle\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.357994 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85kf2\" (UniqueName: \"kubernetes.io/projected/a03defc9-9b67-47f0-b87a-ed5345e84c18-kube-api-access-85kf2\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.358013 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a03defc9-9b67-47f0-b87a-ed5345e84c18-etc-swift\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.358034 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a03defc9-9b67-47f0-b87a-ed5345e84c18-scripts\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.358075 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a03defc9-9b67-47f0-b87a-ed5345e84c18-ring-data-devices\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.358107 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-swiftconf\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.358132 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-dispersionconf\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.370782 5008 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50828756-f458-460c-9b9a-c6890f956394-kube-api-access-wd64r" (OuterVolumeSpecName: "kube-api-access-wd64r") pod "50828756-f458-460c-9b9a-c6890f956394" (UID: "50828756-f458-460c-9b9a-c6890f956394"). InnerVolumeSpecName "kube-api-access-wd64r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.376308 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-config" (OuterVolumeSpecName: "config") pod "50828756-f458-460c-9b9a-c6890f956394" (UID: "50828756-f458-460c-9b9a-c6890f956394"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.376682 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "50828756-f458-460c-9b9a-c6890f956394" (UID: "50828756-f458-460c-9b9a-c6890f956394"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.378024 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "50828756-f458-460c-9b9a-c6890f956394" (UID: "50828756-f458-460c-9b9a-c6890f956394"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.459096 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-combined-ca-bundle\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.459133 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85kf2\" (UniqueName: \"kubernetes.io/projected/a03defc9-9b67-47f0-b87a-ed5345e84c18-kube-api-access-85kf2\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.459155 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a03defc9-9b67-47f0-b87a-ed5345e84c18-etc-swift\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.459179 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a03defc9-9b67-47f0-b87a-ed5345e84c18-scripts\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.459220 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a03defc9-9b67-47f0-b87a-ed5345e84c18-ring-data-devices\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 
crc kubenswrapper[5008]: I0318 18:22:27.459248 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-swiftconf\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.459272 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-dispersionconf\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.459335 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.459348 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd64r\" (UniqueName: \"kubernetes.io/projected/50828756-f458-460c-9b9a-c6890f956394-kube-api-access-wd64r\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.459360 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.459370 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50828756-f458-460c-9b9a-c6890f956394-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.460179 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a03defc9-9b67-47f0-b87a-ed5345e84c18-scripts\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.460201 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a03defc9-9b67-47f0-b87a-ed5345e84c18-ring-data-devices\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.460663 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a03defc9-9b67-47f0-b87a-ed5345e84c18-etc-swift\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.462144 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-dispersionconf\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.462263 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-combined-ca-bundle\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.463084 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-swiftconf\") pod \"swift-ring-rebalance-klvjh\" (UID: 
\"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.474654 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85kf2\" (UniqueName: \"kubernetes.io/projected/a03defc9-9b67-47f0-b87a-ed5345e84c18-kube-api-access-85kf2\") pod \"swift-ring-rebalance-klvjh\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.652923 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.662494 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:27 crc kubenswrapper[5008]: E0318 18:22:27.662731 5008 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 18:22:27 crc kubenswrapper[5008]: E0318 18:22:27.662769 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 18:22:27 crc kubenswrapper[5008]: E0318 18:22:27.662841 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift podName:fcb3859a-2fc0-4479-a59d-7888246899a9 nodeName:}" failed. No retries permitted until 2026-03-18 18:22:28.662819369 +0000 UTC m=+1205.182292458 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift") pod "swift-storage-0" (UID: "fcb3859a-2fc0-4479-a59d-7888246899a9") : configmap "swift-ring-files" not found Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.724488 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.724614 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ccf85c649-bxxns" event={"ID":"50828756-f458-460c-9b9a-c6890f956394","Type":"ContainerDied","Data":"58d4604c7a56aa71a8300a77bd5e7c8fb5d1b67eb4dca8c69ae1a9b1916c44be"} Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.724707 5008 scope.go:117] "RemoveContainer" containerID="3d8a5d075842b9b906d09b36823b74a0a7d70a9fe9a3ff06576568363fde3942" Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.727335 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7f1c2fc8-83c6-4183-ac62-f23ad5db8610","Type":"ContainerStarted","Data":"206cf3d2c63c905b33d66457e0c59d6b9a03aa56d755f65c7bf6f090188ab2df"} Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.732961 5008 generic.go:334] "Generic (PLEG): container finished" podID="b622014a-11bc-48b9-9960-08670363a6a5" containerID="4dcdeaa3ca31e2a61905cb29d49ad6db81f53ef85392789a0b1eaeccf429ba3c" exitCode=0 Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.733035 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-94sv2" event={"ID":"b622014a-11bc-48b9-9960-08670363a6a5","Type":"ContainerDied","Data":"4dcdeaa3ca31e2a61905cb29d49ad6db81f53ef85392789a0b1eaeccf429ba3c"} Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.733113 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-94sv2" 
event={"ID":"b622014a-11bc-48b9-9960-08670363a6a5","Type":"ContainerStarted","Data":"39a4e62c66a3b4d505c4a1218f5a2a86ca355d1f6796c0dc86a531cfb57727b0"} Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.740402 5008 generic.go:334] "Generic (PLEG): container finished" podID="9b085aa0-d1ca-47a4-9b12-588dc9be67fe" containerID="34521d9f15a48c44fcbedade648e3cbe3d7738b22fb676c0b85404a2785ed4ff" exitCode=0 Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.740645 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" event={"ID":"9b085aa0-d1ca-47a4-9b12-588dc9be67fe","Type":"ContainerDied","Data":"34521d9f15a48c44fcbedade648e3cbe3d7738b22fb676c0b85404a2785ed4ff"} Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.817525 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ccf85c649-bxxns"] Mar 18 18:22:27 crc kubenswrapper[5008]: I0318 18:22:27.836254 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ccf85c649-bxxns"] Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.133709 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-klvjh"] Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.214943 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50828756-f458-460c-9b9a-c6890f956394" path="/var/lib/kubelet/pods/50828756-f458-460c-9b9a-c6890f956394/volumes" Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.215726 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af39f6a2-816f-4c55-952b-77a69da86828" path="/var/lib/kubelet/pods/af39f6a2-816f-4c55-952b-77a69da86828/volumes" Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.216165 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc9d239d-b8eb-4e7a-a630-52617586149e" path="/var/lib/kubelet/pods/dc9d239d-b8eb-4e7a-a630-52617586149e/volumes" Mar 18 18:22:28 crc kubenswrapper[5008]: 
I0318 18:22:28.683803 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:28 crc kubenswrapper[5008]: E0318 18:22:28.684004 5008 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 18:22:28 crc kubenswrapper[5008]: E0318 18:22:28.684170 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 18:22:28 crc kubenswrapper[5008]: E0318 18:22:28.684221 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift podName:fcb3859a-2fc0-4479-a59d-7888246899a9 nodeName:}" failed. No retries permitted until 2026-03-18 18:22:30.68420364 +0000 UTC m=+1207.203676719 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift") pod "swift-storage-0" (UID: "fcb3859a-2fc0-4479-a59d-7888246899a9") : configmap "swift-ring-files" not found Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.750131 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-94sv2" event={"ID":"b622014a-11bc-48b9-9960-08670363a6a5","Type":"ContainerStarted","Data":"ac0f293b965c3084d260e581b2eaa344384d9f5e8dd2b3d3b4daf5a57d2f23ca"} Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.751193 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.752177 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-klvjh" event={"ID":"a03defc9-9b67-47f0-b87a-ed5345e84c18","Type":"ContainerStarted","Data":"5ccd4cc029e26e7b51514cfe4a41a0c95fbb3f0d7d8e9626b4ab9c855e30dab9"} Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.753682 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" event={"ID":"9b085aa0-d1ca-47a4-9b12-588dc9be67fe","Type":"ContainerStarted","Data":"16f326ba4481d5794a3cf862529549c6f7b6bc1c7a74670132db25e791e7ddc5"} Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.754189 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.757999 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7f1c2fc8-83c6-4183-ac62-f23ad5db8610","Type":"ContainerStarted","Data":"c17a1b2d5a41cb5fcdc52c5477e557d5c46788d343c6e07f1cda8f8d094698b3"} Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.758031 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"7f1c2fc8-83c6-4183-ac62-f23ad5db8610","Type":"ContainerStarted","Data":"a68259ee70a42cd608552137a47d03512f5b8b70baec67d791e451ee9cf46bac"} Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.758625 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.779093 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f697c8bff-94sv2" podStartSLOduration=3.77907629 podStartE2EDuration="3.77907629s" podCreationTimestamp="2026-03-18 18:22:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:28.778105016 +0000 UTC m=+1205.297578145" watchObservedRunningTime="2026-03-18 18:22:28.77907629 +0000 UTC m=+1205.298549369" Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.800805 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" podStartSLOduration=3.800787693 podStartE2EDuration="3.800787693s" podCreationTimestamp="2026-03-18 18:22:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:28.797300946 +0000 UTC m=+1205.316774045" watchObservedRunningTime="2026-03-18 18:22:28.800787693 +0000 UTC m=+1205.320260772" Mar 18 18:22:28 crc kubenswrapper[5008]: I0318 18:22:28.837405 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.5315466989999997 podStartE2EDuration="3.837388007s" podCreationTimestamp="2026-03-18 18:22:25 +0000 UTC" firstStartedPulling="2026-03-18 18:22:27.037491795 +0000 UTC m=+1203.556964874" lastFinishedPulling="2026-03-18 18:22:28.343333103 +0000 UTC m=+1204.862806182" observedRunningTime="2026-03-18 18:22:28.830062684 +0000 UTC m=+1205.349535773" 
watchObservedRunningTime="2026-03-18 18:22:28.837388007 +0000 UTC m=+1205.356861086" Mar 18 18:22:29 crc kubenswrapper[5008]: I0318 18:22:29.696163 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 18:22:29 crc kubenswrapper[5008]: I0318 18:22:29.781342 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 18:22:30 crc kubenswrapper[5008]: I0318 18:22:30.737587 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:30 crc kubenswrapper[5008]: E0318 18:22:30.737792 5008 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 18:22:30 crc kubenswrapper[5008]: E0318 18:22:30.737904 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 18:22:30 crc kubenswrapper[5008]: E0318 18:22:30.737960 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift podName:fcb3859a-2fc0-4479-a59d-7888246899a9 nodeName:}" failed. No retries permitted until 2026-03-18 18:22:34.737943373 +0000 UTC m=+1211.257416452 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift") pod "swift-storage-0" (UID: "fcb3859a-2fc0-4479-a59d-7888246899a9") : configmap "swift-ring-files" not found Mar 18 18:22:31 crc kubenswrapper[5008]: I0318 18:22:31.817505 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 18:22:31 crc kubenswrapper[5008]: I0318 18:22:31.818054 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 18:22:31 crc kubenswrapper[5008]: I0318 18:22:31.985078 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.032003 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-b5z8q"] Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.033214 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b5z8q" Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.035322 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.042705 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b5z8q"] Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.163277 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9bfaa97-2440-4ee9-8e14-893f6ad81460-operator-scripts\") pod \"root-account-create-update-b5z8q\" (UID: \"b9bfaa97-2440-4ee9-8e14-893f6ad81460\") " pod="openstack/root-account-create-update-b5z8q" Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.163354 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nm26\" (UniqueName: \"kubernetes.io/projected/b9bfaa97-2440-4ee9-8e14-893f6ad81460-kube-api-access-4nm26\") pod \"root-account-create-update-b5z8q\" (UID: \"b9bfaa97-2440-4ee9-8e14-893f6ad81460\") " pod="openstack/root-account-create-update-b5z8q" Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.266819 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9bfaa97-2440-4ee9-8e14-893f6ad81460-operator-scripts\") pod \"root-account-create-update-b5z8q\" (UID: \"b9bfaa97-2440-4ee9-8e14-893f6ad81460\") " pod="openstack/root-account-create-update-b5z8q" Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.266895 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nm26\" (UniqueName: \"kubernetes.io/projected/b9bfaa97-2440-4ee9-8e14-893f6ad81460-kube-api-access-4nm26\") pod \"root-account-create-update-b5z8q\" (UID: 
\"b9bfaa97-2440-4ee9-8e14-893f6ad81460\") " pod="openstack/root-account-create-update-b5z8q" Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.267751 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9bfaa97-2440-4ee9-8e14-893f6ad81460-operator-scripts\") pod \"root-account-create-update-b5z8q\" (UID: \"b9bfaa97-2440-4ee9-8e14-893f6ad81460\") " pod="openstack/root-account-create-update-b5z8q" Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.287582 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nm26\" (UniqueName: \"kubernetes.io/projected/b9bfaa97-2440-4ee9-8e14-893f6ad81460-kube-api-access-4nm26\") pod \"root-account-create-update-b5z8q\" (UID: \"b9bfaa97-2440-4ee9-8e14-893f6ad81460\") " pod="openstack/root-account-create-update-b5z8q" Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.357305 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b5z8q" Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.778538 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b5z8q"] Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.819037 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-klvjh" event={"ID":"a03defc9-9b67-47f0-b87a-ed5345e84c18","Type":"ContainerStarted","Data":"a3defe79165feeaba9c65d446da97fc8a83798ed98fa4748fbf2caaeb30f67d5"} Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.820087 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b5z8q" event={"ID":"b9bfaa97-2440-4ee9-8e14-893f6ad81460","Type":"ContainerStarted","Data":"3bd474cfb131675234b2ba93ece625d1ee9adbcb2f40647fdc0bd17783a4f317"} Mar 18 18:22:32 crc kubenswrapper[5008]: I0318 18:22:32.851114 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-klvjh" podStartSLOduration=1.907771597 podStartE2EDuration="5.851094294s" podCreationTimestamp="2026-03-18 18:22:27 +0000 UTC" firstStartedPulling="2026-03-18 18:22:28.310079532 +0000 UTC m=+1204.829552611" lastFinishedPulling="2026-03-18 18:22:32.253402229 +0000 UTC m=+1208.772875308" observedRunningTime="2026-03-18 18:22:32.849959935 +0000 UTC m=+1209.369433014" watchObservedRunningTime="2026-03-18 18:22:32.851094294 +0000 UTC m=+1209.370567373" Mar 18 18:22:33 crc kubenswrapper[5008]: I0318 18:22:33.046850 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 18:22:33 crc kubenswrapper[5008]: I0318 18:22:33.827035 5008 generic.go:334] "Generic (PLEG): container finished" podID="b9bfaa97-2440-4ee9-8e14-893f6ad81460" containerID="b19a7df1f361dc1f5f1167306b019694b44a326d6b04be0b9566e13afd0d3b22" exitCode=0 Mar 18 18:22:33 crc kubenswrapper[5008]: I0318 18:22:33.827157 5008 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b5z8q" event={"ID":"b9bfaa97-2440-4ee9-8e14-893f6ad81460","Type":"ContainerDied","Data":"b19a7df1f361dc1f5f1167306b019694b44a326d6b04be0b9566e13afd0d3b22"} Mar 18 18:22:33 crc kubenswrapper[5008]: I0318 18:22:33.873202 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-f7btr"] Mar 18 18:22:33 crc kubenswrapper[5008]: I0318 18:22:33.874376 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-f7btr" Mar 18 18:22:33 crc kubenswrapper[5008]: I0318 18:22:33.879789 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-f7btr"] Mar 18 18:22:33 crc kubenswrapper[5008]: I0318 18:22:33.979198 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-835c-account-create-update-6znqk"] Mar 18 18:22:33 crc kubenswrapper[5008]: I0318 18:22:33.980108 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-835c-account-create-update-6znqk" Mar 18 18:22:33 crc kubenswrapper[5008]: I0318 18:22:33.983047 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 18:22:33 crc kubenswrapper[5008]: I0318 18:22:33.994283 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-835c-account-create-update-6znqk"] Mar 18 18:22:33 crc kubenswrapper[5008]: I0318 18:22:33.994467 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad5b158-154c-4219-8a1d-d6df23e11d42-operator-scripts\") pod \"glance-db-create-f7btr\" (UID: \"4ad5b158-154c-4219-8a1d-d6df23e11d42\") " pod="openstack/glance-db-create-f7btr" Mar 18 18:22:33 crc kubenswrapper[5008]: I0318 18:22:33.994515 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6drr\" (UniqueName: \"kubernetes.io/projected/4ad5b158-154c-4219-8a1d-d6df23e11d42-kube-api-access-c6drr\") pod \"glance-db-create-f7btr\" (UID: \"4ad5b158-154c-4219-8a1d-d6df23e11d42\") " pod="openstack/glance-db-create-f7btr" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.096421 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp7fz\" (UniqueName: \"kubernetes.io/projected/4c40e7e7-0a01-41e4-a1e0-b30f415be2d2-kube-api-access-gp7fz\") pod \"glance-835c-account-create-update-6znqk\" (UID: \"4c40e7e7-0a01-41e4-a1e0-b30f415be2d2\") " pod="openstack/glance-835c-account-create-update-6znqk" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.096501 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad5b158-154c-4219-8a1d-d6df23e11d42-operator-scripts\") pod \"glance-db-create-f7btr\" (UID: 
\"4ad5b158-154c-4219-8a1d-d6df23e11d42\") " pod="openstack/glance-db-create-f7btr" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.096604 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6drr\" (UniqueName: \"kubernetes.io/projected/4ad5b158-154c-4219-8a1d-d6df23e11d42-kube-api-access-c6drr\") pod \"glance-db-create-f7btr\" (UID: \"4ad5b158-154c-4219-8a1d-d6df23e11d42\") " pod="openstack/glance-db-create-f7btr" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.096765 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c40e7e7-0a01-41e4-a1e0-b30f415be2d2-operator-scripts\") pod \"glance-835c-account-create-update-6znqk\" (UID: \"4c40e7e7-0a01-41e4-a1e0-b30f415be2d2\") " pod="openstack/glance-835c-account-create-update-6znqk" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.097351 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad5b158-154c-4219-8a1d-d6df23e11d42-operator-scripts\") pod \"glance-db-create-f7btr\" (UID: \"4ad5b158-154c-4219-8a1d-d6df23e11d42\") " pod="openstack/glance-db-create-f7btr" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.120245 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6drr\" (UniqueName: \"kubernetes.io/projected/4ad5b158-154c-4219-8a1d-d6df23e11d42-kube-api-access-c6drr\") pod \"glance-db-create-f7btr\" (UID: \"4ad5b158-154c-4219-8a1d-d6df23e11d42\") " pod="openstack/glance-db-create-f7btr" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.198830 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c40e7e7-0a01-41e4-a1e0-b30f415be2d2-operator-scripts\") pod \"glance-835c-account-create-update-6znqk\" (UID: 
\"4c40e7e7-0a01-41e4-a1e0-b30f415be2d2\") " pod="openstack/glance-835c-account-create-update-6znqk" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.198988 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp7fz\" (UniqueName: \"kubernetes.io/projected/4c40e7e7-0a01-41e4-a1e0-b30f415be2d2-kube-api-access-gp7fz\") pod \"glance-835c-account-create-update-6znqk\" (UID: \"4c40e7e7-0a01-41e4-a1e0-b30f415be2d2\") " pod="openstack/glance-835c-account-create-update-6znqk" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.200065 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c40e7e7-0a01-41e4-a1e0-b30f415be2d2-operator-scripts\") pod \"glance-835c-account-create-update-6znqk\" (UID: \"4c40e7e7-0a01-41e4-a1e0-b30f415be2d2\") " pod="openstack/glance-835c-account-create-update-6znqk" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.205875 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-f7btr" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.223004 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp7fz\" (UniqueName: \"kubernetes.io/projected/4c40e7e7-0a01-41e4-a1e0-b30f415be2d2-kube-api-access-gp7fz\") pod \"glance-835c-account-create-update-6znqk\" (UID: \"4c40e7e7-0a01-41e4-a1e0-b30f415be2d2\") " pod="openstack/glance-835c-account-create-update-6znqk" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.298118 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-835c-account-create-update-6znqk" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.548461 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-q2tvv"] Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.549860 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-q2tvv" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.559143 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q2tvv"] Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.625128 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctb8s\" (UniqueName: \"kubernetes.io/projected/51f847fb-ea4e-4c60-82ab-8401eb7bf256-kube-api-access-ctb8s\") pod \"keystone-db-create-q2tvv\" (UID: \"51f847fb-ea4e-4c60-82ab-8401eb7bf256\") " pod="openstack/keystone-db-create-q2tvv" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.625211 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f847fb-ea4e-4c60-82ab-8401eb7bf256-operator-scripts\") pod \"keystone-db-create-q2tvv\" (UID: \"51f847fb-ea4e-4c60-82ab-8401eb7bf256\") " pod="openstack/keystone-db-create-q2tvv" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.637536 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0a95-account-create-update-2rjzw"] Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.639154 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0a95-account-create-update-2rjzw" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.641431 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.647642 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0a95-account-create-update-2rjzw"] Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.726584 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f847fb-ea4e-4c60-82ab-8401eb7bf256-operator-scripts\") pod \"keystone-db-create-q2tvv\" (UID: \"51f847fb-ea4e-4c60-82ab-8401eb7bf256\") " pod="openstack/keystone-db-create-q2tvv" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.726635 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47b56aea-4152-4110-8c73-02754afa2807-operator-scripts\") pod \"keystone-0a95-account-create-update-2rjzw\" (UID: \"47b56aea-4152-4110-8c73-02754afa2807\") " pod="openstack/keystone-0a95-account-create-update-2rjzw" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.726671 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pprgs\" (UniqueName: \"kubernetes.io/projected/47b56aea-4152-4110-8c73-02754afa2807-kube-api-access-pprgs\") pod \"keystone-0a95-account-create-update-2rjzw\" (UID: \"47b56aea-4152-4110-8c73-02754afa2807\") " pod="openstack/keystone-0a95-account-create-update-2rjzw" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.726830 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctb8s\" (UniqueName: \"kubernetes.io/projected/51f847fb-ea4e-4c60-82ab-8401eb7bf256-kube-api-access-ctb8s\") pod \"keystone-db-create-q2tvv\" 
(UID: \"51f847fb-ea4e-4c60-82ab-8401eb7bf256\") " pod="openstack/keystone-db-create-q2tvv" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.729001 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f847fb-ea4e-4c60-82ab-8401eb7bf256-operator-scripts\") pod \"keystone-db-create-q2tvv\" (UID: \"51f847fb-ea4e-4c60-82ab-8401eb7bf256\") " pod="openstack/keystone-db-create-q2tvv" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.744547 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-f7btr"] Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.749267 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctb8s\" (UniqueName: \"kubernetes.io/projected/51f847fb-ea4e-4c60-82ab-8401eb7bf256-kube-api-access-ctb8s\") pod \"keystone-db-create-q2tvv\" (UID: \"51f847fb-ea4e-4c60-82ab-8401eb7bf256\") " pod="openstack/keystone-db-create-q2tvv" Mar 18 18:22:34 crc kubenswrapper[5008]: W0318 18:22:34.751584 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ad5b158_154c_4219_8a1d_d6df23e11d42.slice/crio-37fb58c861448a45f1399524066b90cf7b3005a5fbcb37467f2f6154d12b660d WatchSource:0}: Error finding container 37fb58c861448a45f1399524066b90cf7b3005a5fbcb37467f2f6154d12b660d: Status 404 returned error can't find the container with id 37fb58c861448a45f1399524066b90cf7b3005a5fbcb37467f2f6154d12b660d Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.828852 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47b56aea-4152-4110-8c73-02754afa2807-operator-scripts\") pod \"keystone-0a95-account-create-update-2rjzw\" (UID: \"47b56aea-4152-4110-8c73-02754afa2807\") " pod="openstack/keystone-0a95-account-create-update-2rjzw" Mar 18 18:22:34 crc 
kubenswrapper[5008]: I0318 18:22:34.829284 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pprgs\" (UniqueName: \"kubernetes.io/projected/47b56aea-4152-4110-8c73-02754afa2807-kube-api-access-pprgs\") pod \"keystone-0a95-account-create-update-2rjzw\" (UID: \"47b56aea-4152-4110-8c73-02754afa2807\") " pod="openstack/keystone-0a95-account-create-update-2rjzw" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.829325 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:34 crc kubenswrapper[5008]: E0318 18:22:34.829536 5008 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 18:22:34 crc kubenswrapper[5008]: E0318 18:22:34.829575 5008 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 18:22:34 crc kubenswrapper[5008]: E0318 18:22:34.829630 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift podName:fcb3859a-2fc0-4479-a59d-7888246899a9 nodeName:}" failed. No retries permitted until 2026-03-18 18:22:42.829609759 +0000 UTC m=+1219.349082858 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift") pod "swift-storage-0" (UID: "fcb3859a-2fc0-4479-a59d-7888246899a9") : configmap "swift-ring-files" not found Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.830781 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47b56aea-4152-4110-8c73-02754afa2807-operator-scripts\") pod \"keystone-0a95-account-create-update-2rjzw\" (UID: \"47b56aea-4152-4110-8c73-02754afa2807\") " pod="openstack/keystone-0a95-account-create-update-2rjzw" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.836412 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-klfwj"] Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.837439 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-klfwj" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.849617 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-f7btr" event={"ID":"4ad5b158-154c-4219-8a1d-d6df23e11d42","Type":"ContainerStarted","Data":"37fb58c861448a45f1399524066b90cf7b3005a5fbcb37467f2f6154d12b660d"} Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.851773 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pprgs\" (UniqueName: \"kubernetes.io/projected/47b56aea-4152-4110-8c73-02754afa2807-kube-api-access-pprgs\") pod \"keystone-0a95-account-create-update-2rjzw\" (UID: \"47b56aea-4152-4110-8c73-02754afa2807\") " pod="openstack/keystone-0a95-account-create-update-2rjzw" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.857670 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-34da-account-create-update-m48lt"] Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.858787 5008 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/placement-34da-account-create-update-m48lt" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.862248 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.865166 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-klfwj"] Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.876071 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q2tvv" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.885969 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-835c-account-create-update-6znqk"] Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.898489 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-34da-account-create-update-m48lt"] Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.931142 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f-operator-scripts\") pod \"placement-34da-account-create-update-m48lt\" (UID: \"e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f\") " pod="openstack/placement-34da-account-create-update-m48lt" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.931199 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z9tx\" (UniqueName: \"kubernetes.io/projected/e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f-kube-api-access-5z9tx\") pod \"placement-34da-account-create-update-m48lt\" (UID: \"e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f\") " pod="openstack/placement-34da-account-create-update-m48lt" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.931242 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0649a6ec-1562-4617-9af7-0dafa2e201eb-operator-scripts\") pod \"placement-db-create-klfwj\" (UID: \"0649a6ec-1562-4617-9af7-0dafa2e201eb\") " pod="openstack/placement-db-create-klfwj" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.931322 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btjzt\" (UniqueName: \"kubernetes.io/projected/0649a6ec-1562-4617-9af7-0dafa2e201eb-kube-api-access-btjzt\") pod \"placement-db-create-klfwj\" (UID: \"0649a6ec-1562-4617-9af7-0dafa2e201eb\") " pod="openstack/placement-db-create-klfwj" Mar 18 18:22:34 crc kubenswrapper[5008]: I0318 18:22:34.956869 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a95-account-create-update-2rjzw" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.032778 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f-operator-scripts\") pod \"placement-34da-account-create-update-m48lt\" (UID: \"e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f\") " pod="openstack/placement-34da-account-create-update-m48lt" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.032899 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z9tx\" (UniqueName: \"kubernetes.io/projected/e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f-kube-api-access-5z9tx\") pod \"placement-34da-account-create-update-m48lt\" (UID: \"e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f\") " pod="openstack/placement-34da-account-create-update-m48lt" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.032990 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0649a6ec-1562-4617-9af7-0dafa2e201eb-operator-scripts\") pod 
\"placement-db-create-klfwj\" (UID: \"0649a6ec-1562-4617-9af7-0dafa2e201eb\") " pod="openstack/placement-db-create-klfwj" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.033138 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btjzt\" (UniqueName: \"kubernetes.io/projected/0649a6ec-1562-4617-9af7-0dafa2e201eb-kube-api-access-btjzt\") pod \"placement-db-create-klfwj\" (UID: \"0649a6ec-1562-4617-9af7-0dafa2e201eb\") " pod="openstack/placement-db-create-klfwj" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.036927 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f-operator-scripts\") pod \"placement-34da-account-create-update-m48lt\" (UID: \"e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f\") " pod="openstack/placement-34da-account-create-update-m48lt" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.037276 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0649a6ec-1562-4617-9af7-0dafa2e201eb-operator-scripts\") pod \"placement-db-create-klfwj\" (UID: \"0649a6ec-1562-4617-9af7-0dafa2e201eb\") " pod="openstack/placement-db-create-klfwj" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.051697 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btjzt\" (UniqueName: \"kubernetes.io/projected/0649a6ec-1562-4617-9af7-0dafa2e201eb-kube-api-access-btjzt\") pod \"placement-db-create-klfwj\" (UID: \"0649a6ec-1562-4617-9af7-0dafa2e201eb\") " pod="openstack/placement-db-create-klfwj" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.051736 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z9tx\" (UniqueName: \"kubernetes.io/projected/e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f-kube-api-access-5z9tx\") pod 
\"placement-34da-account-create-update-m48lt\" (UID: \"e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f\") " pod="openstack/placement-34da-account-create-update-m48lt" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.175927 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-klfwj" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.186697 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-34da-account-create-update-m48lt" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.259708 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b5z8q" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.341754 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9bfaa97-2440-4ee9-8e14-893f6ad81460-operator-scripts\") pod \"b9bfaa97-2440-4ee9-8e14-893f6ad81460\" (UID: \"b9bfaa97-2440-4ee9-8e14-893f6ad81460\") " Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.341841 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nm26\" (UniqueName: \"kubernetes.io/projected/b9bfaa97-2440-4ee9-8e14-893f6ad81460-kube-api-access-4nm26\") pod \"b9bfaa97-2440-4ee9-8e14-893f6ad81460\" (UID: \"b9bfaa97-2440-4ee9-8e14-893f6ad81460\") " Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.344131 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9bfaa97-2440-4ee9-8e14-893f6ad81460-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9bfaa97-2440-4ee9-8e14-893f6ad81460" (UID: "b9bfaa97-2440-4ee9-8e14-893f6ad81460"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.348737 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9bfaa97-2440-4ee9-8e14-893f6ad81460-kube-api-access-4nm26" (OuterVolumeSpecName: "kube-api-access-4nm26") pod "b9bfaa97-2440-4ee9-8e14-893f6ad81460" (UID: "b9bfaa97-2440-4ee9-8e14-893f6ad81460"). InnerVolumeSpecName "kube-api-access-4nm26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.418350 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q2tvv"] Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.444262 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nm26\" (UniqueName: \"kubernetes.io/projected/b9bfaa97-2440-4ee9-8e14-893f6ad81460-kube-api-access-4nm26\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.444641 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9bfaa97-2440-4ee9-8e14-893f6ad81460-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.500267 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0a95-account-create-update-2rjzw"] Mar 18 18:22:35 crc kubenswrapper[5008]: W0318 18:22:35.508090 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47b56aea_4152_4110_8c73_02754afa2807.slice/crio-c92eab831cacb6b15410880cf0b774a211c439a362b47c2b7162959a0128eee1 WatchSource:0}: Error finding container c92eab831cacb6b15410880cf0b774a211c439a362b47c2b7162959a0128eee1: Status 404 returned error can't find the container with id c92eab831cacb6b15410880cf0b774a211c439a362b47c2b7162959a0128eee1 Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 
18:22:35.632992 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.805968 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-klfwj"] Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.834499 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-34da-account-create-update-m48lt"] Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.879013 5008 generic.go:334] "Generic (PLEG): container finished" podID="4ad5b158-154c-4219-8a1d-d6df23e11d42" containerID="aa4c03c5ba72d29132d0cc211fd855cde6fba56b8c3be090f1c3efa9bd536ac1" exitCode=0 Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.879078 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-f7btr" event={"ID":"4ad5b158-154c-4219-8a1d-d6df23e11d42","Type":"ContainerDied","Data":"aa4c03c5ba72d29132d0cc211fd855cde6fba56b8c3be090f1c3efa9bd536ac1"} Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.910838 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q2tvv" event={"ID":"51f847fb-ea4e-4c60-82ab-8401eb7bf256","Type":"ContainerStarted","Data":"5bdab6b60b219776cab7681033a182c7213dc8d2bd5bf154ccb49793888a5a7a"} Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.911381 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q2tvv" event={"ID":"51f847fb-ea4e-4c60-82ab-8401eb7bf256","Type":"ContainerStarted","Data":"110e964bc8bce76ff55020b3a81f00f17953f9419c6c1911bf274cb0ee09b5cd"} Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.918690 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-34da-account-create-update-m48lt" event={"ID":"e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f","Type":"ContainerStarted","Data":"ba1310dc5e744ba5982f54e929347a7f560b0919b2c47127bc7b0617ff7067d9"} Mar 18 18:22:35 crc 
kubenswrapper[5008]: I0318 18:22:35.954841 5008 generic.go:334] "Generic (PLEG): container finished" podID="4c40e7e7-0a01-41e4-a1e0-b30f415be2d2" containerID="798be6ece7d6cf52f11cd5de0dcd46ad44365eb13feba7934e2b9d594db9fe52" exitCode=0 Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.954951 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-835c-account-create-update-6znqk" event={"ID":"4c40e7e7-0a01-41e4-a1e0-b30f415be2d2","Type":"ContainerDied","Data":"798be6ece7d6cf52f11cd5de0dcd46ad44365eb13feba7934e2b9d594db9fe52"} Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.954978 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-835c-account-create-update-6znqk" event={"ID":"4c40e7e7-0a01-41e4-a1e0-b30f415be2d2","Type":"ContainerStarted","Data":"5473b13374378ceca68d3ec83dbe8567ba46d590f91c6419e2807fed620241e1"} Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.978601 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-q2tvv" podStartSLOduration=1.978579548 podStartE2EDuration="1.978579548s" podCreationTimestamp="2026-03-18 18:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:35.931170333 +0000 UTC m=+1212.450643412" watchObservedRunningTime="2026-03-18 18:22:35.978579548 +0000 UTC m=+1212.498052627" Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.983756 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a95-account-create-update-2rjzw" event={"ID":"47b56aea-4152-4110-8c73-02754afa2807","Type":"ContainerStarted","Data":"321d03f647f7f12ac4e9aa811c21fba6a9c727553aff1c409ce57b0b62f6e45a"} Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.983810 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a95-account-create-update-2rjzw" 
event={"ID":"47b56aea-4152-4110-8c73-02754afa2807","Type":"ContainerStarted","Data":"c92eab831cacb6b15410880cf0b774a211c439a362b47c2b7162959a0128eee1"} Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.999654 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b5z8q" event={"ID":"b9bfaa97-2440-4ee9-8e14-893f6ad81460","Type":"ContainerDied","Data":"3bd474cfb131675234b2ba93ece625d1ee9adbcb2f40647fdc0bd17783a4f317"} Mar 18 18:22:35 crc kubenswrapper[5008]: I0318 18:22:35.999834 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bd474cfb131675234b2ba93ece625d1ee9adbcb2f40647fdc0bd17783a4f317" Mar 18 18:22:36 crc kubenswrapper[5008]: I0318 18:22:36.000007 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b5z8q" Mar 18 18:22:36 crc kubenswrapper[5008]: I0318 18:22:36.009911 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-klfwj" event={"ID":"0649a6ec-1562-4617-9af7-0dafa2e201eb","Type":"ContainerStarted","Data":"8b075982dc05ca321697a0a0e5ef5f6719bf1d6e0a425f3bbd651d511fe2b8bd"} Mar 18 18:22:36 crc kubenswrapper[5008]: I0318 18:22:36.013343 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-0a95-account-create-update-2rjzw" podStartSLOduration=2.013326766 podStartE2EDuration="2.013326766s" podCreationTimestamp="2026-03-18 18:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:36.009465539 +0000 UTC m=+1212.528938618" watchObservedRunningTime="2026-03-18 18:22:36.013326766 +0000 UTC m=+1212.532799845" Mar 18 18:22:36 crc kubenswrapper[5008]: I0318 18:22:36.066696 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:36 crc kubenswrapper[5008]: 
I0318 18:22:36.117571 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:22:36 crc kubenswrapper[5008]: I0318 18:22:36.176884 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-94sv2"] Mar 18 18:22:36 crc kubenswrapper[5008]: I0318 18:22:36.704579 5008 scope.go:117] "RemoveContainer" containerID="9d7e5ba332e27f2ccb008910b992f9726d91d3d9540fae35b5488a12d7d5acb1" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.019519 5008 generic.go:334] "Generic (PLEG): container finished" podID="51f847fb-ea4e-4c60-82ab-8401eb7bf256" containerID="5bdab6b60b219776cab7681033a182c7213dc8d2bd5bf154ccb49793888a5a7a" exitCode=0 Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.019615 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q2tvv" event={"ID":"51f847fb-ea4e-4c60-82ab-8401eb7bf256","Type":"ContainerDied","Data":"5bdab6b60b219776cab7681033a182c7213dc8d2bd5bf154ccb49793888a5a7a"} Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.021189 5008 generic.go:334] "Generic (PLEG): container finished" podID="0649a6ec-1562-4617-9af7-0dafa2e201eb" containerID="43680a431967dbc17226617faa158bead9bd647ac537a3bcf95125106214b4b9" exitCode=0 Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.021259 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-klfwj" event={"ID":"0649a6ec-1562-4617-9af7-0dafa2e201eb","Type":"ContainerDied","Data":"43680a431967dbc17226617faa158bead9bd647ac537a3bcf95125106214b4b9"} Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.023059 5008 generic.go:334] "Generic (PLEG): container finished" podID="e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f" containerID="be610e0953ab45ee488b558d8bb0e6d94fda507475baed0a207f7d1beaa99d7f" exitCode=0 Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.023133 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-34da-account-create-update-m48lt" event={"ID":"e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f","Type":"ContainerDied","Data":"be610e0953ab45ee488b558d8bb0e6d94fda507475baed0a207f7d1beaa99d7f"} Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.024612 5008 generic.go:334] "Generic (PLEG): container finished" podID="47b56aea-4152-4110-8c73-02754afa2807" containerID="321d03f647f7f12ac4e9aa811c21fba6a9c727553aff1c409ce57b0b62f6e45a" exitCode=0 Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.024671 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a95-account-create-update-2rjzw" event={"ID":"47b56aea-4152-4110-8c73-02754afa2807","Type":"ContainerDied","Data":"321d03f647f7f12ac4e9aa811c21fba6a9c727553aff1c409ce57b0b62f6e45a"} Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.024946 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f697c8bff-94sv2" podUID="b622014a-11bc-48b9-9960-08670363a6a5" containerName="dnsmasq-dns" containerID="cri-o://ac0f293b965c3084d260e581b2eaa344384d9f5e8dd2b3d3b4daf5a57d2f23ca" gracePeriod=10 Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.388247 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-f7btr" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.466589 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-835c-account-create-update-6znqk" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.490028 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad5b158-154c-4219-8a1d-d6df23e11d42-operator-scripts\") pod \"4ad5b158-154c-4219-8a1d-d6df23e11d42\" (UID: \"4ad5b158-154c-4219-8a1d-d6df23e11d42\") " Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.490092 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6drr\" (UniqueName: \"kubernetes.io/projected/4ad5b158-154c-4219-8a1d-d6df23e11d42-kube-api-access-c6drr\") pod \"4ad5b158-154c-4219-8a1d-d6df23e11d42\" (UID: \"4ad5b158-154c-4219-8a1d-d6df23e11d42\") " Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.490978 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ad5b158-154c-4219-8a1d-d6df23e11d42-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ad5b158-154c-4219-8a1d-d6df23e11d42" (UID: "4ad5b158-154c-4219-8a1d-d6df23e11d42"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.497024 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ad5b158-154c-4219-8a1d-d6df23e11d42-kube-api-access-c6drr" (OuterVolumeSpecName: "kube-api-access-c6drr") pod "4ad5b158-154c-4219-8a1d-d6df23e11d42" (UID: "4ad5b158-154c-4219-8a1d-d6df23e11d42"). InnerVolumeSpecName "kube-api-access-c6drr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.528262 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.591869 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-dns-svc\") pod \"b622014a-11bc-48b9-9960-08670363a6a5\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.591962 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shxm7\" (UniqueName: \"kubernetes.io/projected/b622014a-11bc-48b9-9960-08670363a6a5-kube-api-access-shxm7\") pod \"b622014a-11bc-48b9-9960-08670363a6a5\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.591979 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-config\") pod \"b622014a-11bc-48b9-9960-08670363a6a5\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.591994 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-ovsdbserver-nb\") pod \"b622014a-11bc-48b9-9960-08670363a6a5\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.592043 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-ovsdbserver-sb\") pod \"b622014a-11bc-48b9-9960-08670363a6a5\" (UID: \"b622014a-11bc-48b9-9960-08670363a6a5\") " Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.592138 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp7fz\" 
(UniqueName: \"kubernetes.io/projected/4c40e7e7-0a01-41e4-a1e0-b30f415be2d2-kube-api-access-gp7fz\") pod \"4c40e7e7-0a01-41e4-a1e0-b30f415be2d2\" (UID: \"4c40e7e7-0a01-41e4-a1e0-b30f415be2d2\") " Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.592157 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c40e7e7-0a01-41e4-a1e0-b30f415be2d2-operator-scripts\") pod \"4c40e7e7-0a01-41e4-a1e0-b30f415be2d2\" (UID: \"4c40e7e7-0a01-41e4-a1e0-b30f415be2d2\") " Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.592450 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad5b158-154c-4219-8a1d-d6df23e11d42-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.592466 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6drr\" (UniqueName: \"kubernetes.io/projected/4ad5b158-154c-4219-8a1d-d6df23e11d42-kube-api-access-c6drr\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.592940 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c40e7e7-0a01-41e4-a1e0-b30f415be2d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c40e7e7-0a01-41e4-a1e0-b30f415be2d2" (UID: "4c40e7e7-0a01-41e4-a1e0-b30f415be2d2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.605694 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b622014a-11bc-48b9-9960-08670363a6a5-kube-api-access-shxm7" (OuterVolumeSpecName: "kube-api-access-shxm7") pod "b622014a-11bc-48b9-9960-08670363a6a5" (UID: "b622014a-11bc-48b9-9960-08670363a6a5"). InnerVolumeSpecName "kube-api-access-shxm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.610628 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c40e7e7-0a01-41e4-a1e0-b30f415be2d2-kube-api-access-gp7fz" (OuterVolumeSpecName: "kube-api-access-gp7fz") pod "4c40e7e7-0a01-41e4-a1e0-b30f415be2d2" (UID: "4c40e7e7-0a01-41e4-a1e0-b30f415be2d2"). InnerVolumeSpecName "kube-api-access-gp7fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.650267 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b622014a-11bc-48b9-9960-08670363a6a5" (UID: "b622014a-11bc-48b9-9960-08670363a6a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.653666 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b622014a-11bc-48b9-9960-08670363a6a5" (UID: "b622014a-11bc-48b9-9960-08670363a6a5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.655383 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b622014a-11bc-48b9-9960-08670363a6a5" (UID: "b622014a-11bc-48b9-9960-08670363a6a5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.660150 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-config" (OuterVolumeSpecName: "config") pod "b622014a-11bc-48b9-9960-08670363a6a5" (UID: "b622014a-11bc-48b9-9960-08670363a6a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.694437 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.694490 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shxm7\" (UniqueName: \"kubernetes.io/projected/b622014a-11bc-48b9-9960-08670363a6a5-kube-api-access-shxm7\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.694505 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.694517 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.694526 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b622014a-11bc-48b9-9960-08670363a6a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.694537 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp7fz\" (UniqueName: 
\"kubernetes.io/projected/4c40e7e7-0a01-41e4-a1e0-b30f415be2d2-kube-api-access-gp7fz\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:37 crc kubenswrapper[5008]: I0318 18:22:37.694547 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c40e7e7-0a01-41e4-a1e0-b30f415be2d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.037111 5008 generic.go:334] "Generic (PLEG): container finished" podID="b622014a-11bc-48b9-9960-08670363a6a5" containerID="ac0f293b965c3084d260e581b2eaa344384d9f5e8dd2b3d3b4daf5a57d2f23ca" exitCode=0 Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.037194 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-94sv2" event={"ID":"b622014a-11bc-48b9-9960-08670363a6a5","Type":"ContainerDied","Data":"ac0f293b965c3084d260e581b2eaa344384d9f5e8dd2b3d3b4daf5a57d2f23ca"} Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.037224 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-94sv2" event={"ID":"b622014a-11bc-48b9-9960-08670363a6a5","Type":"ContainerDied","Data":"39a4e62c66a3b4d505c4a1218f5a2a86ca355d1f6796c0dc86a531cfb57727b0"} Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.037252 5008 scope.go:117] "RemoveContainer" containerID="ac0f293b965c3084d260e581b2eaa344384d9f5e8dd2b3d3b4daf5a57d2f23ca" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.037378 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-94sv2" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.042545 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-835c-account-create-update-6znqk" event={"ID":"4c40e7e7-0a01-41e4-a1e0-b30f415be2d2","Type":"ContainerDied","Data":"5473b13374378ceca68d3ec83dbe8567ba46d590f91c6419e2807fed620241e1"} Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.042587 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-835c-account-create-update-6znqk" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.042598 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5473b13374378ceca68d3ec83dbe8567ba46d590f91c6419e2807fed620241e1" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.047799 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-f7btr" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.047841 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-f7btr" event={"ID":"4ad5b158-154c-4219-8a1d-d6df23e11d42","Type":"ContainerDied","Data":"37fb58c861448a45f1399524066b90cf7b3005a5fbcb37467f2f6154d12b660d"} Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.047864 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37fb58c861448a45f1399524066b90cf7b3005a5fbcb37467f2f6154d12b660d" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.094634 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-94sv2"] Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.102580 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-94sv2"] Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.106442 5008 scope.go:117] "RemoveContainer" 
containerID="4dcdeaa3ca31e2a61905cb29d49ad6db81f53ef85392789a0b1eaeccf429ba3c" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.143068 5008 scope.go:117] "RemoveContainer" containerID="ac0f293b965c3084d260e581b2eaa344384d9f5e8dd2b3d3b4daf5a57d2f23ca" Mar 18 18:22:38 crc kubenswrapper[5008]: E0318 18:22:38.146207 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac0f293b965c3084d260e581b2eaa344384d9f5e8dd2b3d3b4daf5a57d2f23ca\": container with ID starting with ac0f293b965c3084d260e581b2eaa344384d9f5e8dd2b3d3b4daf5a57d2f23ca not found: ID does not exist" containerID="ac0f293b965c3084d260e581b2eaa344384d9f5e8dd2b3d3b4daf5a57d2f23ca" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.146252 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac0f293b965c3084d260e581b2eaa344384d9f5e8dd2b3d3b4daf5a57d2f23ca"} err="failed to get container status \"ac0f293b965c3084d260e581b2eaa344384d9f5e8dd2b3d3b4daf5a57d2f23ca\": rpc error: code = NotFound desc = could not find container \"ac0f293b965c3084d260e581b2eaa344384d9f5e8dd2b3d3b4daf5a57d2f23ca\": container with ID starting with ac0f293b965c3084d260e581b2eaa344384d9f5e8dd2b3d3b4daf5a57d2f23ca not found: ID does not exist" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.146280 5008 scope.go:117] "RemoveContainer" containerID="4dcdeaa3ca31e2a61905cb29d49ad6db81f53ef85392789a0b1eaeccf429ba3c" Mar 18 18:22:38 crc kubenswrapper[5008]: E0318 18:22:38.146945 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dcdeaa3ca31e2a61905cb29d49ad6db81f53ef85392789a0b1eaeccf429ba3c\": container with ID starting with 4dcdeaa3ca31e2a61905cb29d49ad6db81f53ef85392789a0b1eaeccf429ba3c not found: ID does not exist" containerID="4dcdeaa3ca31e2a61905cb29d49ad6db81f53ef85392789a0b1eaeccf429ba3c" Mar 18 18:22:38 crc 
kubenswrapper[5008]: I0318 18:22:38.146999 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dcdeaa3ca31e2a61905cb29d49ad6db81f53ef85392789a0b1eaeccf429ba3c"} err="failed to get container status \"4dcdeaa3ca31e2a61905cb29d49ad6db81f53ef85392789a0b1eaeccf429ba3c\": rpc error: code = NotFound desc = could not find container \"4dcdeaa3ca31e2a61905cb29d49ad6db81f53ef85392789a0b1eaeccf429ba3c\": container with ID starting with 4dcdeaa3ca31e2a61905cb29d49ad6db81f53ef85392789a0b1eaeccf429ba3c not found: ID does not exist" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.218283 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b622014a-11bc-48b9-9960-08670363a6a5" path="/var/lib/kubelet/pods/b622014a-11bc-48b9-9960-08670363a6a5/volumes" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.448203 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-34da-account-create-update-m48lt" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.507499 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z9tx\" (UniqueName: \"kubernetes.io/projected/e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f-kube-api-access-5z9tx\") pod \"e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f\" (UID: \"e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f\") " Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.507655 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f-operator-scripts\") pod \"e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f\" (UID: \"e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f\") " Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.508840 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f" (UID: "e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.513291 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f-kube-api-access-5z9tx" (OuterVolumeSpecName: "kube-api-access-5z9tx") pod "e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f" (UID: "e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f"). InnerVolumeSpecName "kube-api-access-5z9tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.565114 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q2tvv" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.566547 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a95-account-create-update-2rjzw" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.576818 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-klfwj" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.610236 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pprgs\" (UniqueName: \"kubernetes.io/projected/47b56aea-4152-4110-8c73-02754afa2807-kube-api-access-pprgs\") pod \"47b56aea-4152-4110-8c73-02754afa2807\" (UID: \"47b56aea-4152-4110-8c73-02754afa2807\") " Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.610372 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btjzt\" (UniqueName: \"kubernetes.io/projected/0649a6ec-1562-4617-9af7-0dafa2e201eb-kube-api-access-btjzt\") pod \"0649a6ec-1562-4617-9af7-0dafa2e201eb\" (UID: \"0649a6ec-1562-4617-9af7-0dafa2e201eb\") " Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.610403 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f847fb-ea4e-4c60-82ab-8401eb7bf256-operator-scripts\") pod \"51f847fb-ea4e-4c60-82ab-8401eb7bf256\" (UID: \"51f847fb-ea4e-4c60-82ab-8401eb7bf256\") " Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.610420 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctb8s\" (UniqueName: \"kubernetes.io/projected/51f847fb-ea4e-4c60-82ab-8401eb7bf256-kube-api-access-ctb8s\") pod \"51f847fb-ea4e-4c60-82ab-8401eb7bf256\" (UID: \"51f847fb-ea4e-4c60-82ab-8401eb7bf256\") " Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.610546 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47b56aea-4152-4110-8c73-02754afa2807-operator-scripts\") pod \"47b56aea-4152-4110-8c73-02754afa2807\" (UID: \"47b56aea-4152-4110-8c73-02754afa2807\") " Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.610623 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0649a6ec-1562-4617-9af7-0dafa2e201eb-operator-scripts\") pod \"0649a6ec-1562-4617-9af7-0dafa2e201eb\" (UID: \"0649a6ec-1562-4617-9af7-0dafa2e201eb\") " Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.611592 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z9tx\" (UniqueName: \"kubernetes.io/projected/e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f-kube-api-access-5z9tx\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.611617 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.611701 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47b56aea-4152-4110-8c73-02754afa2807-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47b56aea-4152-4110-8c73-02754afa2807" (UID: "47b56aea-4152-4110-8c73-02754afa2807"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.612443 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51f847fb-ea4e-4c60-82ab-8401eb7bf256-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51f847fb-ea4e-4c60-82ab-8401eb7bf256" (UID: "51f847fb-ea4e-4c60-82ab-8401eb7bf256"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.612514 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0649a6ec-1562-4617-9af7-0dafa2e201eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0649a6ec-1562-4617-9af7-0dafa2e201eb" (UID: "0649a6ec-1562-4617-9af7-0dafa2e201eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.619794 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b56aea-4152-4110-8c73-02754afa2807-kube-api-access-pprgs" (OuterVolumeSpecName: "kube-api-access-pprgs") pod "47b56aea-4152-4110-8c73-02754afa2807" (UID: "47b56aea-4152-4110-8c73-02754afa2807"). InnerVolumeSpecName "kube-api-access-pprgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.620760 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0649a6ec-1562-4617-9af7-0dafa2e201eb-kube-api-access-btjzt" (OuterVolumeSpecName: "kube-api-access-btjzt") pod "0649a6ec-1562-4617-9af7-0dafa2e201eb" (UID: "0649a6ec-1562-4617-9af7-0dafa2e201eb"). InnerVolumeSpecName "kube-api-access-btjzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.625497 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f847fb-ea4e-4c60-82ab-8401eb7bf256-kube-api-access-ctb8s" (OuterVolumeSpecName: "kube-api-access-ctb8s") pod "51f847fb-ea4e-4c60-82ab-8401eb7bf256" (UID: "51f847fb-ea4e-4c60-82ab-8401eb7bf256"). InnerVolumeSpecName "kube-api-access-ctb8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.713104 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pprgs\" (UniqueName: \"kubernetes.io/projected/47b56aea-4152-4110-8c73-02754afa2807-kube-api-access-pprgs\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.713142 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btjzt\" (UniqueName: \"kubernetes.io/projected/0649a6ec-1562-4617-9af7-0dafa2e201eb-kube-api-access-btjzt\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.713151 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f847fb-ea4e-4c60-82ab-8401eb7bf256-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.713159 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctb8s\" (UniqueName: \"kubernetes.io/projected/51f847fb-ea4e-4c60-82ab-8401eb7bf256-kube-api-access-ctb8s\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.713168 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47b56aea-4152-4110-8c73-02754afa2807-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:38 crc kubenswrapper[5008]: I0318 18:22:38.713176 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0649a6ec-1562-4617-9af7-0dafa2e201eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.056814 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a95-account-create-update-2rjzw" 
event={"ID":"47b56aea-4152-4110-8c73-02754afa2807","Type":"ContainerDied","Data":"c92eab831cacb6b15410880cf0b774a211c439a362b47c2b7162959a0128eee1"} Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.057694 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c92eab831cacb6b15410880cf0b774a211c439a362b47c2b7162959a0128eee1" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.057846 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a95-account-create-update-2rjzw" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.065117 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-klfwj" event={"ID":"0649a6ec-1562-4617-9af7-0dafa2e201eb","Type":"ContainerDied","Data":"8b075982dc05ca321697a0a0e5ef5f6719bf1d6e0a425f3bbd651d511fe2b8bd"} Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.065165 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b075982dc05ca321697a0a0e5ef5f6719bf1d6e0a425f3bbd651d511fe2b8bd" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.065216 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-klfwj" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.067036 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q2tvv" event={"ID":"51f847fb-ea4e-4c60-82ab-8401eb7bf256","Type":"ContainerDied","Data":"110e964bc8bce76ff55020b3a81f00f17953f9419c6c1911bf274cb0ee09b5cd"} Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.067077 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="110e964bc8bce76ff55020b3a81f00f17953f9419c6c1911bf274cb0ee09b5cd" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.067133 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-q2tvv" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.072057 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-34da-account-create-update-m48lt" event={"ID":"e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f","Type":"ContainerDied","Data":"ba1310dc5e744ba5982f54e929347a7f560b0919b2c47127bc7b0617ff7067d9"} Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.072101 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba1310dc5e744ba5982f54e929347a7f560b0919b2c47127bc7b0617ff7067d9" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.072110 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-34da-account-create-update-m48lt" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.140299 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-jqd2z"] Mar 18 18:22:39 crc kubenswrapper[5008]: E0318 18:22:39.140784 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f847fb-ea4e-4c60-82ab-8401eb7bf256" containerName="mariadb-database-create" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.140799 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f847fb-ea4e-4c60-82ab-8401eb7bf256" containerName="mariadb-database-create" Mar 18 18:22:39 crc kubenswrapper[5008]: E0318 18:22:39.140807 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0649a6ec-1562-4617-9af7-0dafa2e201eb" containerName="mariadb-database-create" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.140815 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0649a6ec-1562-4617-9af7-0dafa2e201eb" containerName="mariadb-database-create" Mar 18 18:22:39 crc kubenswrapper[5008]: E0318 18:22:39.140836 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f" 
containerName="mariadb-account-create-update" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.140842 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f" containerName="mariadb-account-create-update" Mar 18 18:22:39 crc kubenswrapper[5008]: E0318 18:22:39.140856 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b56aea-4152-4110-8c73-02754afa2807" containerName="mariadb-account-create-update" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.140862 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b56aea-4152-4110-8c73-02754afa2807" containerName="mariadb-account-create-update" Mar 18 18:22:39 crc kubenswrapper[5008]: E0318 18:22:39.140877 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b622014a-11bc-48b9-9960-08670363a6a5" containerName="init" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.140884 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b622014a-11bc-48b9-9960-08670363a6a5" containerName="init" Mar 18 18:22:39 crc kubenswrapper[5008]: E0318 18:22:39.142079 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9bfaa97-2440-4ee9-8e14-893f6ad81460" containerName="mariadb-account-create-update" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.142095 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bfaa97-2440-4ee9-8e14-893f6ad81460" containerName="mariadb-account-create-update" Mar 18 18:22:39 crc kubenswrapper[5008]: E0318 18:22:39.142109 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad5b158-154c-4219-8a1d-d6df23e11d42" containerName="mariadb-database-create" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.142116 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad5b158-154c-4219-8a1d-d6df23e11d42" containerName="mariadb-database-create" Mar 18 18:22:39 crc kubenswrapper[5008]: E0318 18:22:39.142131 5008 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b622014a-11bc-48b9-9960-08670363a6a5" containerName="dnsmasq-dns" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.142144 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b622014a-11bc-48b9-9960-08670363a6a5" containerName="dnsmasq-dns" Mar 18 18:22:39 crc kubenswrapper[5008]: E0318 18:22:39.142151 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c40e7e7-0a01-41e4-a1e0-b30f415be2d2" containerName="mariadb-account-create-update" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.142157 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c40e7e7-0a01-41e4-a1e0-b30f415be2d2" containerName="mariadb-account-create-update" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.142304 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f847fb-ea4e-4c60-82ab-8401eb7bf256" containerName="mariadb-database-create" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.142317 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9bfaa97-2440-4ee9-8e14-893f6ad81460" containerName="mariadb-account-create-update" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.142328 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c40e7e7-0a01-41e4-a1e0-b30f415be2d2" containerName="mariadb-account-create-update" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.142343 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ad5b158-154c-4219-8a1d-d6df23e11d42" containerName="mariadb-database-create" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.142350 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b622014a-11bc-48b9-9960-08670363a6a5" containerName="dnsmasq-dns" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.142356 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b56aea-4152-4110-8c73-02754afa2807" containerName="mariadb-account-create-update" Mar 18 18:22:39 crc 
kubenswrapper[5008]: I0318 18:22:39.142364 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f" containerName="mariadb-account-create-update" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.142372 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0649a6ec-1562-4617-9af7-0dafa2e201eb" containerName="mariadb-database-create" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.143023 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.144634 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-nh2zd" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.146223 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.149698 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jqd2z"] Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.221647 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65qdw\" (UniqueName: \"kubernetes.io/projected/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-kube-api-access-65qdw\") pod \"glance-db-sync-jqd2z\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.222100 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-db-sync-config-data\") pod \"glance-db-sync-jqd2z\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.222288 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-config-data\") pod \"glance-db-sync-jqd2z\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.222430 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-combined-ca-bundle\") pod \"glance-db-sync-jqd2z\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.324367 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-config-data\") pod \"glance-db-sync-jqd2z\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.324655 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-combined-ca-bundle\") pod \"glance-db-sync-jqd2z\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.324902 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65qdw\" (UniqueName: \"kubernetes.io/projected/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-kube-api-access-65qdw\") pod \"glance-db-sync-jqd2z\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.325050 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-db-sync-config-data\") pod \"glance-db-sync-jqd2z\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.328747 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-config-data\") pod \"glance-db-sync-jqd2z\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.329845 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-combined-ca-bundle\") pod \"glance-db-sync-jqd2z\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.330854 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-db-sync-config-data\") pod \"glance-db-sync-jqd2z\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.342143 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65qdw\" (UniqueName: \"kubernetes.io/projected/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-kube-api-access-65qdw\") pod \"glance-db-sync-jqd2z\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:39 crc kubenswrapper[5008]: I0318 18:22:39.456027 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jqd2z" Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.126391 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jqd2z"] Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.471348 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-b5z8q"] Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.476435 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-b5z8q"] Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.491750 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-8wr5b"] Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.492840 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-8wr5b" Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.497647 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.498977 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8wr5b"] Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.548540 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sgxb\" (UniqueName: \"kubernetes.io/projected/f6bef0b8-65da-409d-967a-5b49a28835d3-kube-api-access-8sgxb\") pod \"root-account-create-update-8wr5b\" (UID: \"f6bef0b8-65da-409d-967a-5b49a28835d3\") " pod="openstack/root-account-create-update-8wr5b" Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.548758 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6bef0b8-65da-409d-967a-5b49a28835d3-operator-scripts\") pod 
\"root-account-create-update-8wr5b\" (UID: \"f6bef0b8-65da-409d-967a-5b49a28835d3\") " pod="openstack/root-account-create-update-8wr5b" Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.649882 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6bef0b8-65da-409d-967a-5b49a28835d3-operator-scripts\") pod \"root-account-create-update-8wr5b\" (UID: \"f6bef0b8-65da-409d-967a-5b49a28835d3\") " pod="openstack/root-account-create-update-8wr5b" Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.650278 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sgxb\" (UniqueName: \"kubernetes.io/projected/f6bef0b8-65da-409d-967a-5b49a28835d3-kube-api-access-8sgxb\") pod \"root-account-create-update-8wr5b\" (UID: \"f6bef0b8-65da-409d-967a-5b49a28835d3\") " pod="openstack/root-account-create-update-8wr5b" Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.650636 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6bef0b8-65da-409d-967a-5b49a28835d3-operator-scripts\") pod \"root-account-create-update-8wr5b\" (UID: \"f6bef0b8-65da-409d-967a-5b49a28835d3\") " pod="openstack/root-account-create-update-8wr5b" Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.675341 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sgxb\" (UniqueName: \"kubernetes.io/projected/f6bef0b8-65da-409d-967a-5b49a28835d3-kube-api-access-8sgxb\") pod \"root-account-create-update-8wr5b\" (UID: \"f6bef0b8-65da-409d-967a-5b49a28835d3\") " pod="openstack/root-account-create-update-8wr5b" Mar 18 18:22:40 crc kubenswrapper[5008]: I0318 18:22:40.824186 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8wr5b" Mar 18 18:22:41 crc kubenswrapper[5008]: I0318 18:22:41.096635 5008 generic.go:334] "Generic (PLEG): container finished" podID="a03defc9-9b67-47f0-b87a-ed5345e84c18" containerID="a3defe79165feeaba9c65d446da97fc8a83798ed98fa4748fbf2caaeb30f67d5" exitCode=0 Mar 18 18:22:41 crc kubenswrapper[5008]: I0318 18:22:41.096707 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-klvjh" event={"ID":"a03defc9-9b67-47f0-b87a-ed5345e84c18","Type":"ContainerDied","Data":"a3defe79165feeaba9c65d446da97fc8a83798ed98fa4748fbf2caaeb30f67d5"} Mar 18 18:22:41 crc kubenswrapper[5008]: I0318 18:22:41.098165 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jqd2z" event={"ID":"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3","Type":"ContainerStarted","Data":"cc5f2900c4b65f84bbe0433f7c5a1fe63cbd4312aec266d08b31fff0b4899c95"} Mar 18 18:22:41 crc kubenswrapper[5008]: I0318 18:22:41.277258 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8wr5b"] Mar 18 18:22:41 crc kubenswrapper[5008]: W0318 18:22:41.284318 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6bef0b8_65da_409d_967a_5b49a28835d3.slice/crio-599f5c51334d8a45951c9f33d25372fbc492470fc87d2ec091f3584680138055 WatchSource:0}: Error finding container 599f5c51334d8a45951c9f33d25372fbc492470fc87d2ec091f3584680138055: Status 404 returned error can't find the container with id 599f5c51334d8a45951c9f33d25372fbc492470fc87d2ec091f3584680138055 Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.106103 5008 generic.go:334] "Generic (PLEG): container finished" podID="f6bef0b8-65da-409d-967a-5b49a28835d3" containerID="16c0e14b3ff7895a9c2c9c4b897294a5ac7ce1bd99ff7ff4b5ed6bef900c7565" exitCode=0 Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.106480 5008 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8wr5b" event={"ID":"f6bef0b8-65da-409d-967a-5b49a28835d3","Type":"ContainerDied","Data":"16c0e14b3ff7895a9c2c9c4b897294a5ac7ce1bd99ff7ff4b5ed6bef900c7565"} Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.106504 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8wr5b" event={"ID":"f6bef0b8-65da-409d-967a-5b49a28835d3","Type":"ContainerStarted","Data":"599f5c51334d8a45951c9f33d25372fbc492470fc87d2ec091f3584680138055"} Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.215820 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9bfaa97-2440-4ee9-8e14-893f6ad81460" path="/var/lib/kubelet/pods/b9bfaa97-2440-4ee9-8e14-893f6ad81460/volumes" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.448331 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.487371 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85kf2\" (UniqueName: \"kubernetes.io/projected/a03defc9-9b67-47f0-b87a-ed5345e84c18-kube-api-access-85kf2\") pod \"a03defc9-9b67-47f0-b87a-ed5345e84c18\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.487449 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a03defc9-9b67-47f0-b87a-ed5345e84c18-etc-swift\") pod \"a03defc9-9b67-47f0-b87a-ed5345e84c18\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.487494 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-swiftconf\") pod \"a03defc9-9b67-47f0-b87a-ed5345e84c18\" (UID: 
\"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.487525 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-combined-ca-bundle\") pod \"a03defc9-9b67-47f0-b87a-ed5345e84c18\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.488269 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-dispersionconf\") pod \"a03defc9-9b67-47f0-b87a-ed5345e84c18\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.488297 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a03defc9-9b67-47f0-b87a-ed5345e84c18-ring-data-devices\") pod \"a03defc9-9b67-47f0-b87a-ed5345e84c18\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.488326 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a03defc9-9b67-47f0-b87a-ed5345e84c18-scripts\") pod \"a03defc9-9b67-47f0-b87a-ed5345e84c18\" (UID: \"a03defc9-9b67-47f0-b87a-ed5345e84c18\") " Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.489053 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a03defc9-9b67-47f0-b87a-ed5345e84c18-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a03defc9-9b67-47f0-b87a-ed5345e84c18" (UID: "a03defc9-9b67-47f0-b87a-ed5345e84c18"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.489654 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a03defc9-9b67-47f0-b87a-ed5345e84c18-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a03defc9-9b67-47f0-b87a-ed5345e84c18" (UID: "a03defc9-9b67-47f0-b87a-ed5345e84c18"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.500840 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a03defc9-9b67-47f0-b87a-ed5345e84c18-kube-api-access-85kf2" (OuterVolumeSpecName: "kube-api-access-85kf2") pod "a03defc9-9b67-47f0-b87a-ed5345e84c18" (UID: "a03defc9-9b67-47f0-b87a-ed5345e84c18"). InnerVolumeSpecName "kube-api-access-85kf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.515756 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a03defc9-9b67-47f0-b87a-ed5345e84c18" (UID: "a03defc9-9b67-47f0-b87a-ed5345e84c18"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.516367 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a03defc9-9b67-47f0-b87a-ed5345e84c18" (UID: "a03defc9-9b67-47f0-b87a-ed5345e84c18"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.529442 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a03defc9-9b67-47f0-b87a-ed5345e84c18" (UID: "a03defc9-9b67-47f0-b87a-ed5345e84c18"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.545347 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a03defc9-9b67-47f0-b87a-ed5345e84c18-scripts" (OuterVolumeSpecName: "scripts") pod "a03defc9-9b67-47f0-b87a-ed5345e84c18" (UID: "a03defc9-9b67-47f0-b87a-ed5345e84c18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.590995 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a03defc9-9b67-47f0-b87a-ed5345e84c18-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.591175 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85kf2\" (UniqueName: \"kubernetes.io/projected/a03defc9-9b67-47f0-b87a-ed5345e84c18-kube-api-access-85kf2\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.591196 5008 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a03defc9-9b67-47f0-b87a-ed5345e84c18-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.591211 5008 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:42 crc kubenswrapper[5008]: 
I0318 18:22:42.591220 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.591233 5008 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a03defc9-9b67-47f0-b87a-ed5345e84c18-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.591244 5008 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a03defc9-9b67-47f0-b87a-ed5345e84c18-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.895415 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.899282 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift\") pod \"swift-storage-0\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") " pod="openstack/swift-storage-0" Mar 18 18:22:42 crc kubenswrapper[5008]: I0318 18:22:42.930322 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:43.116279 5008 generic.go:334] "Generic (PLEG): container finished" podID="3d5f0191-2702-46ed-ab82-e8c93ec1cf02" containerID="0fa5685cc03d616eba3ae80660451a0ccfc2e904f623b03160efae22735e2260" exitCode=0 Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:43.116359 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3d5f0191-2702-46ed-ab82-e8c93ec1cf02","Type":"ContainerDied","Data":"0fa5685cc03d616eba3ae80660451a0ccfc2e904f623b03160efae22735e2260"} Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:43.120022 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-klvjh" event={"ID":"a03defc9-9b67-47f0-b87a-ed5345e84c18","Type":"ContainerDied","Data":"5ccd4cc029e26e7b51514cfe4a41a0c95fbb3f0d7d8e9626b4ab9c855e30dab9"} Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:43.120054 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ccd4cc029e26e7b51514cfe4a41a0c95fbb3f0d7d8e9626b4ab9c855e30dab9" Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:43.122629 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-klvjh" Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:44.130762 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3d5f0191-2702-46ed-ab82-e8c93ec1cf02","Type":"ContainerStarted","Data":"de7e46d1ea764d5f5ca940e23f45711a4a0780768418aba0d16fc5bc141f5795"} Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:44.131313 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:44.160336 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371980.694456 podStartE2EDuration="56.160318789s" podCreationTimestamp="2026-03-18 18:21:48 +0000 UTC" firstStartedPulling="2026-03-18 18:21:50.262600008 +0000 UTC m=+1166.782073087" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:22:44.156019451 +0000 UTC m=+1220.675492530" watchObservedRunningTime="2026-03-18 18:22:44.160318789 +0000 UTC m=+1220.679791868" Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:44.662716 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8wr5b" Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:44.730983 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sgxb\" (UniqueName: \"kubernetes.io/projected/f6bef0b8-65da-409d-967a-5b49a28835d3-kube-api-access-8sgxb\") pod \"f6bef0b8-65da-409d-967a-5b49a28835d3\" (UID: \"f6bef0b8-65da-409d-967a-5b49a28835d3\") " Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:44.731176 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6bef0b8-65da-409d-967a-5b49a28835d3-operator-scripts\") pod \"f6bef0b8-65da-409d-967a-5b49a28835d3\" (UID: \"f6bef0b8-65da-409d-967a-5b49a28835d3\") " Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:44.731767 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6bef0b8-65da-409d-967a-5b49a28835d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6bef0b8-65da-409d-967a-5b49a28835d3" (UID: "f6bef0b8-65da-409d-967a-5b49a28835d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:44.747866 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6bef0b8-65da-409d-967a-5b49a28835d3-kube-api-access-8sgxb" (OuterVolumeSpecName: "kube-api-access-8sgxb") pod "f6bef0b8-65da-409d-967a-5b49a28835d3" (UID: "f6bef0b8-65da-409d-967a-5b49a28835d3"). InnerVolumeSpecName "kube-api-access-8sgxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:44.832742 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sgxb\" (UniqueName: \"kubernetes.io/projected/f6bef0b8-65da-409d-967a-5b49a28835d3-kube-api-access-8sgxb\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:44.832798 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6bef0b8-65da-409d-967a-5b49a28835d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:44 crc kubenswrapper[5008]: I0318 18:22:44.855688 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 18:22:45 crc kubenswrapper[5008]: I0318 18:22:45.148051 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8wr5b" event={"ID":"f6bef0b8-65da-409d-967a-5b49a28835d3","Type":"ContainerDied","Data":"599f5c51334d8a45951c9f33d25372fbc492470fc87d2ec091f3584680138055"} Mar 18 18:22:45 crc kubenswrapper[5008]: I0318 18:22:45.148456 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="599f5c51334d8a45951c9f33d25372fbc492470fc87d2ec091f3584680138055" Mar 18 18:22:45 crc kubenswrapper[5008]: I0318 18:22:45.148157 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8wr5b" Mar 18 18:22:45 crc kubenswrapper[5008]: I0318 18:22:45.150289 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"fc4790e13dff7b48eb97ee34aebdb11f419a797789fcd267c5a32795ed706da6"} Mar 18 18:22:45 crc kubenswrapper[5008]: I0318 18:22:45.153234 5008 generic.go:334] "Generic (PLEG): container finished" podID="b60d757b-db66-46c1-ad92-4a9e591217a0" containerID="52849f9d96333bfa73b7b07421af885ac8db5ecfd31488330c2f6db6ec84ddc2" exitCode=0 Mar 18 18:22:45 crc kubenswrapper[5008]: I0318 18:22:45.153363 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b60d757b-db66-46c1-ad92-4a9e591217a0","Type":"ContainerDied","Data":"52849f9d96333bfa73b7b07421af885ac8db5ecfd31488330c2f6db6ec84ddc2"} Mar 18 18:22:46 crc kubenswrapper[5008]: I0318 18:22:46.170783 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b60d757b-db66-46c1-ad92-4a9e591217a0","Type":"ContainerStarted","Data":"82ae68d17e245549f6e1d17dc9fc655108ab134769fb5a66e64e73b24d153411"} Mar 18 18:22:46 crc kubenswrapper[5008]: I0318 18:22:46.172122 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:22:46 crc kubenswrapper[5008]: I0318 18:22:46.213120 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.887587309 podStartE2EDuration="57.21310053s" podCreationTimestamp="2026-03-18 18:21:49 +0000 UTC" firstStartedPulling="2026-03-18 18:21:55.054669013 +0000 UTC m=+1171.574142092" lastFinishedPulling="2026-03-18 18:22:09.380182234 +0000 UTC m=+1185.899655313" observedRunningTime="2026-03-18 18:22:46.208415263 +0000 UTC m=+1222.727888342" watchObservedRunningTime="2026-03-18 
18:22:46.21310053 +0000 UTC m=+1222.732573619" Mar 18 18:22:46 crc kubenswrapper[5008]: I0318 18:22:46.376280 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.287875 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9qcqj" podUID="aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" containerName="ovn-controller" probeResult="failure" output=< Mar 18 18:22:50 crc kubenswrapper[5008]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 18:22:50 crc kubenswrapper[5008]: > Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.312663 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.330177 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.561164 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9qcqj-config-2lsmq"] Mar 18 18:22:50 crc kubenswrapper[5008]: E0318 18:22:50.561551 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bef0b8-65da-409d-967a-5b49a28835d3" containerName="mariadb-account-create-update" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.561587 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bef0b8-65da-409d-967a-5b49a28835d3" containerName="mariadb-account-create-update" Mar 18 18:22:50 crc kubenswrapper[5008]: E0318 18:22:50.561625 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a03defc9-9b67-47f0-b87a-ed5345e84c18" containerName="swift-ring-rebalance" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.561635 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a03defc9-9b67-47f0-b87a-ed5345e84c18" 
containerName="swift-ring-rebalance" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.561818 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a03defc9-9b67-47f0-b87a-ed5345e84c18" containerName="swift-ring-rebalance" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.561850 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6bef0b8-65da-409d-967a-5b49a28835d3" containerName="mariadb-account-create-update" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.562455 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.564368 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.577629 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9qcqj-config-2lsmq"] Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.627457 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-log-ovn\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.627499 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-additional-scripts\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.627661 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-run-ovn\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.627687 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-scripts\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.627706 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-run\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.627722 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54z4w\" (UniqueName: \"kubernetes.io/projected/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-kube-api-access-54z4w\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.729390 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-log-ovn\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.729447 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-additional-scripts\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.729510 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-run-ovn\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.729533 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-scripts\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.729599 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-run\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.729625 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54z4w\" (UniqueName: \"kubernetes.io/projected/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-kube-api-access-54z4w\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.729874 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-log-ovn\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.729932 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-run\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.730045 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-run-ovn\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.730939 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-additional-scripts\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.732577 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-scripts\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.760415 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54z4w\" 
(UniqueName: \"kubernetes.io/projected/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-kube-api-access-54z4w\") pod \"ovn-controller-9qcqj-config-2lsmq\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:50 crc kubenswrapper[5008]: I0318 18:22:50.889818 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:54 crc kubenswrapper[5008]: I0318 18:22:54.275942 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9qcqj-config-2lsmq"] Mar 18 18:22:54 crc kubenswrapper[5008]: W0318 18:22:54.288298 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6e2c6ef_c047_4b85_b686_aab89dbcc7a3.slice/crio-b79fa3d503910a0e63aab333d4bf3b3f6d76bf8d1810b5b7a77e129c1b54c16a WatchSource:0}: Error finding container b79fa3d503910a0e63aab333d4bf3b3f6d76bf8d1810b5b7a77e129c1b54c16a: Status 404 returned error can't find the container with id b79fa3d503910a0e63aab333d4bf3b3f6d76bf8d1810b5b7a77e129c1b54c16a Mar 18 18:22:55 crc kubenswrapper[5008]: I0318 18:22:55.267090 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"b5255bfa8eb99b8162ba17c46557ccb30518ff7df2b8694d473240b663a9ce8c"} Mar 18 18:22:55 crc kubenswrapper[5008]: I0318 18:22:55.267746 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"688589b78817d925eba18cc083d7aae7884af996d5eac87b2f9b8be694e1d743"} Mar 18 18:22:55 crc kubenswrapper[5008]: I0318 18:22:55.267764 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"b75695d9f9722a67c19ee08c21555a403a91fc1e836d2a6b7c94c581c39bc7e8"} Mar 18 18:22:55 crc kubenswrapper[5008]: I0318 18:22:55.267776 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"b2ff7dec8963820747dd167a23cc98ef08104fcb6886f42e0c188e8c2d2b5557"} Mar 18 18:22:55 crc kubenswrapper[5008]: I0318 18:22:55.269001 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jqd2z" event={"ID":"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3","Type":"ContainerStarted","Data":"301939a383694f0855d653da76c5187011b8a736256c7435c14a333754982b7e"} Mar 18 18:22:55 crc kubenswrapper[5008]: I0318 18:22:55.272079 5008 generic.go:334] "Generic (PLEG): container finished" podID="b6e2c6ef-c047-4b85-b686-aab89dbcc7a3" containerID="358a921eea98df73bcc1f5aad28a595444fa783bab3b83f0782f3fbeada80880" exitCode=0 Mar 18 18:22:55 crc kubenswrapper[5008]: I0318 18:22:55.272140 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9qcqj-config-2lsmq" event={"ID":"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3","Type":"ContainerDied","Data":"358a921eea98df73bcc1f5aad28a595444fa783bab3b83f0782f3fbeada80880"} Mar 18 18:22:55 crc kubenswrapper[5008]: I0318 18:22:55.272161 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9qcqj-config-2lsmq" event={"ID":"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3","Type":"ContainerStarted","Data":"b79fa3d503910a0e63aab333d4bf3b3f6d76bf8d1810b5b7a77e129c1b54c16a"} Mar 18 18:22:55 crc kubenswrapper[5008]: I0318 18:22:55.298741 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-jqd2z" podStartSLOduration=2.439392723 podStartE2EDuration="16.298724766s" podCreationTimestamp="2026-03-18 18:22:39 +0000 UTC" firstStartedPulling="2026-03-18 18:22:40.135175145 
+0000 UTC m=+1216.654648224" lastFinishedPulling="2026-03-18 18:22:53.994507188 +0000 UTC m=+1230.513980267" observedRunningTime="2026-03-18 18:22:55.285851694 +0000 UTC m=+1231.805324783" watchObservedRunningTime="2026-03-18 18:22:55.298724766 +0000 UTC m=+1231.818197845" Mar 18 18:22:55 crc kubenswrapper[5008]: I0318 18:22:55.303747 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9qcqj" Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.645882 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.743234 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-run\") pod \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.743521 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-log-ovn\") pod \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.743353 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-run" (OuterVolumeSpecName: "var-run") pod "b6e2c6ef-c047-4b85-b686-aab89dbcc7a3" (UID: "b6e2c6ef-c047-4b85-b686-aab89dbcc7a3"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.743747 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-additional-scripts\") pod \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.743590 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b6e2c6ef-c047-4b85-b686-aab89dbcc7a3" (UID: "b6e2c6ef-c047-4b85-b686-aab89dbcc7a3"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.743979 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54z4w\" (UniqueName: \"kubernetes.io/projected/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-kube-api-access-54z4w\") pod \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.744071 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-scripts\") pod \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.744113 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-run-ovn\") pod \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\" (UID: \"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3\") " Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.744301 5008 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b6e2c6ef-c047-4b85-b686-aab89dbcc7a3" (UID: "b6e2c6ef-c047-4b85-b686-aab89dbcc7a3"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.744761 5008 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.744787 5008 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.744800 5008 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.745073 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b6e2c6ef-c047-4b85-b686-aab89dbcc7a3" (UID: "b6e2c6ef-c047-4b85-b686-aab89dbcc7a3"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.745199 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-scripts" (OuterVolumeSpecName: "scripts") pod "b6e2c6ef-c047-4b85-b686-aab89dbcc7a3" (UID: "b6e2c6ef-c047-4b85-b686-aab89dbcc7a3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.748374 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-kube-api-access-54z4w" (OuterVolumeSpecName: "kube-api-access-54z4w") pod "b6e2c6ef-c047-4b85-b686-aab89dbcc7a3" (UID: "b6e2c6ef-c047-4b85-b686-aab89dbcc7a3"). InnerVolumeSpecName "kube-api-access-54z4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.846298 5008 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.846338 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54z4w\" (UniqueName: \"kubernetes.io/projected/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-kube-api-access-54z4w\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:56 crc kubenswrapper[5008]: I0318 18:22:56.846352 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.292694 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"2a06525b664dc560a781b00430903d7869796e656f728e7637b34cc39532a99e"} Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.293013 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"aa11f18a13f730403dae487c0f3224a3b6d6266ab6e0fc1aab36fa0cff77ecb4"} Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.293024 
5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"21b72f72c110b5cddd921bb4d2588f810988fa4c91525dd72e97c92a5f5d881d"} Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.293034 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"a3d478398fbcc00ebce85e7d90128952489a28ad02808dcc006fc3822c4fdaba"} Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.294309 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9qcqj-config-2lsmq" event={"ID":"b6e2c6ef-c047-4b85-b686-aab89dbcc7a3","Type":"ContainerDied","Data":"b79fa3d503910a0e63aab333d4bf3b3f6d76bf8d1810b5b7a77e129c1b54c16a"} Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.294331 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b79fa3d503910a0e63aab333d4bf3b3f6d76bf8d1810b5b7a77e129c1b54c16a" Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.294354 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9qcqj-config-2lsmq" Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.741878 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9qcqj-config-2lsmq"] Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.749821 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9qcqj-config-2lsmq"] Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.860156 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9qcqj-config-knp7f"] Mar 18 18:22:57 crc kubenswrapper[5008]: E0318 18:22:57.860612 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e2c6ef-c047-4b85-b686-aab89dbcc7a3" containerName="ovn-config" Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.860633 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e2c6ef-c047-4b85-b686-aab89dbcc7a3" containerName="ovn-config" Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.860834 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e2c6ef-c047-4b85-b686-aab89dbcc7a3" containerName="ovn-config" Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.861484 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.862760 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.868552 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9qcqj-config-knp7f"] Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.964407 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68176642-ba72-42e3-8493-57b042ee7ac8-scripts\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.964496 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-run-ovn\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.964527 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/68176642-ba72-42e3-8493-57b042ee7ac8-additional-scripts\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.964586 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-run\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: 
\"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.964608 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4jzc\" (UniqueName: \"kubernetes.io/projected/68176642-ba72-42e3-8493-57b042ee7ac8-kube-api-access-h4jzc\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:57 crc kubenswrapper[5008]: I0318 18:22:57.964629 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-log-ovn\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.067115 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-run\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.067177 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4jzc\" (UniqueName: \"kubernetes.io/projected/68176642-ba72-42e3-8493-57b042ee7ac8-kube-api-access-h4jzc\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.067213 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-log-ovn\") pod 
\"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.067309 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68176642-ba72-42e3-8493-57b042ee7ac8-scripts\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.067447 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-run-ovn\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.067480 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-run\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.067488 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/68176642-ba72-42e3-8493-57b042ee7ac8-additional-scripts\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.067836 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-run-ovn\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: 
\"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.067895 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-log-ovn\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.070053 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68176642-ba72-42e3-8493-57b042ee7ac8-scripts\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.077099 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/68176642-ba72-42e3-8493-57b042ee7ac8-additional-scripts\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.085690 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4jzc\" (UniqueName: \"kubernetes.io/projected/68176642-ba72-42e3-8493-57b042ee7ac8-kube-api-access-h4jzc\") pod \"ovn-controller-9qcqj-config-knp7f\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.218811 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e2c6ef-c047-4b85-b686-aab89dbcc7a3" path="/var/lib/kubelet/pods/b6e2c6ef-c047-4b85-b686-aab89dbcc7a3/volumes" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.232291 5008 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:22:58 crc kubenswrapper[5008]: I0318 18:22:58.911394 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9qcqj-config-knp7f"] Mar 18 18:22:59 crc kubenswrapper[5008]: I0318 18:22:59.314777 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9qcqj-config-knp7f" event={"ID":"68176642-ba72-42e3-8493-57b042ee7ac8","Type":"ContainerStarted","Data":"39c824b90efaad7d881f7adb4b53267bfd4db9871d104ae0ca429fecb399e086"} Mar 18 18:22:59 crc kubenswrapper[5008]: I0318 18:22:59.320030 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"4f571497d968bf39f24266de4994c7de6a2c821baa3ad302407cf536047c662e"} Mar 18 18:22:59 crc kubenswrapper[5008]: I0318 18:22:59.320065 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"3b7732cd3cbc9f6e46f3b52e181285bcbcb64ff5a7d634bf4399f0d57729ef65"} Mar 18 18:22:59 crc kubenswrapper[5008]: I0318 18:22:59.320075 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"0b1ee5c8c45f6646ada20310701b8ec3f99b2a8128a2190acf71a6ef29f4200a"} Mar 18 18:22:59 crc kubenswrapper[5008]: I0318 18:22:59.712793 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.019166 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-968pz"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.020517 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-968pz" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.048866 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-968pz"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.102874 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba33af3d-82b3-406f-9b9d-9511c8e874c1-operator-scripts\") pod \"cinder-db-create-968pz\" (UID: \"ba33af3d-82b3-406f-9b9d-9511c8e874c1\") " pod="openstack/cinder-db-create-968pz" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.102996 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkgtl\" (UniqueName: \"kubernetes.io/projected/ba33af3d-82b3-406f-9b9d-9511c8e874c1-kube-api-access-xkgtl\") pod \"cinder-db-create-968pz\" (UID: \"ba33af3d-82b3-406f-9b9d-9511c8e874c1\") " pod="openstack/cinder-db-create-968pz" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.115065 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3502-account-create-update-m52l9"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.115989 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3502-account-create-update-m52l9" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.120799 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.132249 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3502-account-create-update-m52l9"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.204696 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncwbs\" (UniqueName: \"kubernetes.io/projected/82798150-dec0-4bf6-a917-afe9e9ad020d-kube-api-access-ncwbs\") pod \"cinder-3502-account-create-update-m52l9\" (UID: \"82798150-dec0-4bf6-a917-afe9e9ad020d\") " pod="openstack/cinder-3502-account-create-update-m52l9" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.204748 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba33af3d-82b3-406f-9b9d-9511c8e874c1-operator-scripts\") pod \"cinder-db-create-968pz\" (UID: \"ba33af3d-82b3-406f-9b9d-9511c8e874c1\") " pod="openstack/cinder-db-create-968pz" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.204780 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgtl\" (UniqueName: \"kubernetes.io/projected/ba33af3d-82b3-406f-9b9d-9511c8e874c1-kube-api-access-xkgtl\") pod \"cinder-db-create-968pz\" (UID: \"ba33af3d-82b3-406f-9b9d-9511c8e874c1\") " pod="openstack/cinder-db-create-968pz" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.204865 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82798150-dec0-4bf6-a917-afe9e9ad020d-operator-scripts\") pod \"cinder-3502-account-create-update-m52l9\" (UID: 
\"82798150-dec0-4bf6-a917-afe9e9ad020d\") " pod="openstack/cinder-3502-account-create-update-m52l9" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.207100 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba33af3d-82b3-406f-9b9d-9511c8e874c1-operator-scripts\") pod \"cinder-db-create-968pz\" (UID: \"ba33af3d-82b3-406f-9b9d-9511c8e874c1\") " pod="openstack/cinder-db-create-968pz" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.223790 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkgtl\" (UniqueName: \"kubernetes.io/projected/ba33af3d-82b3-406f-9b9d-9511c8e874c1-kube-api-access-xkgtl\") pod \"cinder-db-create-968pz\" (UID: \"ba33af3d-82b3-406f-9b9d-9511c8e874c1\") " pod="openstack/cinder-db-create-968pz" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.289932 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-wrb54"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.291220 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wrb54" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.305811 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wrb54"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.307041 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82798150-dec0-4bf6-a917-afe9e9ad020d-operator-scripts\") pod \"cinder-3502-account-create-update-m52l9\" (UID: \"82798150-dec0-4bf6-a917-afe9e9ad020d\") " pod="openstack/cinder-3502-account-create-update-m52l9" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.307172 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncwbs\" (UniqueName: \"kubernetes.io/projected/82798150-dec0-4bf6-a917-afe9e9ad020d-kube-api-access-ncwbs\") pod \"cinder-3502-account-create-update-m52l9\" (UID: \"82798150-dec0-4bf6-a917-afe9e9ad020d\") " pod="openstack/cinder-3502-account-create-update-m52l9" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.308179 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82798150-dec0-4bf6-a917-afe9e9ad020d-operator-scripts\") pod \"cinder-3502-account-create-update-m52l9\" (UID: \"82798150-dec0-4bf6-a917-afe9e9ad020d\") " pod="openstack/cinder-3502-account-create-update-m52l9" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.323641 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-79e6-account-create-update-vkr56"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.324830 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-79e6-account-create-update-vkr56" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.335043 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.335844 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-79e6-account-create-update-vkr56"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.336621 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncwbs\" (UniqueName: \"kubernetes.io/projected/82798150-dec0-4bf6-a917-afe9e9ad020d-kube-api-access-ncwbs\") pod \"cinder-3502-account-create-update-m52l9\" (UID: \"82798150-dec0-4bf6-a917-afe9e9ad020d\") " pod="openstack/cinder-3502-account-create-update-m52l9" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.339304 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-968pz" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.352004 5008 generic.go:334] "Generic (PLEG): container finished" podID="68176642-ba72-42e3-8493-57b042ee7ac8" containerID="4f493c0654ac13a7e1955bddbb740f2d66692da8d367b003d1337865c6e92adf" exitCode=0 Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.352088 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9qcqj-config-knp7f" event={"ID":"68176642-ba72-42e3-8493-57b042ee7ac8","Type":"ContainerDied","Data":"4f493c0654ac13a7e1955bddbb740f2d66692da8d367b003d1337865c6e92adf"} Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.392315 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"e00afaaa564c37f366ee4ac26eb8ca94d2c1e8b26ed42d7509ff378a29f8f96a"} Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.392368 5008 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"16ca62d6ed1f662b6bd0ea0c5af9755fe9a957be9453f4224460900b731f6943"} Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.392382 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"380cc5591873123d91f18022fd060b0a9e10c5e3b072ae816f61a2e6ad015a78"} Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.392401 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerStarted","Data":"ee0fd9858e770e37fee73845e8c0a241a341746edd5488d756144d0dbce6ee7b"} Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.411350 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac3fb1a3-21e9-4a67-8c15-e0e12424a5df-operator-scripts\") pod \"barbican-79e6-account-create-update-vkr56\" (UID: \"ac3fb1a3-21e9-4a67-8c15-e0e12424a5df\") " pod="openstack/barbican-79e6-account-create-update-vkr56" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.411439 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj9t7\" (UniqueName: \"kubernetes.io/projected/ac3fb1a3-21e9-4a67-8c15-e0e12424a5df-kube-api-access-mj9t7\") pod \"barbican-79e6-account-create-update-vkr56\" (UID: \"ac3fb1a3-21e9-4a67-8c15-e0e12424a5df\") " pod="openstack/barbican-79e6-account-create-update-vkr56" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.411538 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b-operator-scripts\") pod 
\"barbican-db-create-wrb54\" (UID: \"acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b\") " pod="openstack/barbican-db-create-wrb54" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.411656 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmfp8\" (UniqueName: \"kubernetes.io/projected/acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b-kube-api-access-xmfp8\") pod \"barbican-db-create-wrb54\" (UID: \"acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b\") " pod="openstack/barbican-db-create-wrb54" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.435635 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.827784148 podStartE2EDuration="35.435619307s" podCreationTimestamp="2026-03-18 18:22:25 +0000 UTC" firstStartedPulling="2026-03-18 18:22:44.880162895 +0000 UTC m=+1221.399635974" lastFinishedPulling="2026-03-18 18:22:58.487998054 +0000 UTC m=+1235.007471133" observedRunningTime="2026-03-18 18:23:00.434255713 +0000 UTC m=+1236.953728792" watchObservedRunningTime="2026-03-18 18:23:00.435619307 +0000 UTC m=+1236.955092386" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.501137 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-lllsm"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.507799 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3502-account-create-update-m52l9" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.508149 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-lllsm" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.514454 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b-operator-scripts\") pod \"barbican-db-create-wrb54\" (UID: \"acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b\") " pod="openstack/barbican-db-create-wrb54" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.514604 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmfp8\" (UniqueName: \"kubernetes.io/projected/acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b-kube-api-access-xmfp8\") pod \"barbican-db-create-wrb54\" (UID: \"acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b\") " pod="openstack/barbican-db-create-wrb54" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.514890 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac3fb1a3-21e9-4a67-8c15-e0e12424a5df-operator-scripts\") pod \"barbican-79e6-account-create-update-vkr56\" (UID: \"ac3fb1a3-21e9-4a67-8c15-e0e12424a5df\") " pod="openstack/barbican-79e6-account-create-update-vkr56" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.514923 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj9t7\" (UniqueName: \"kubernetes.io/projected/ac3fb1a3-21e9-4a67-8c15-e0e12424a5df-kube-api-access-mj9t7\") pod \"barbican-79e6-account-create-update-vkr56\" (UID: \"ac3fb1a3-21e9-4a67-8c15-e0e12424a5df\") " pod="openstack/barbican-79e6-account-create-update-vkr56" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.516292 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b-operator-scripts\") pod \"barbican-db-create-wrb54\" (UID: 
\"acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b\") " pod="openstack/barbican-db-create-wrb54" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.516376 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac3fb1a3-21e9-4a67-8c15-e0e12424a5df-operator-scripts\") pod \"barbican-79e6-account-create-update-vkr56\" (UID: \"ac3fb1a3-21e9-4a67-8c15-e0e12424a5df\") " pod="openstack/barbican-79e6-account-create-update-vkr56" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.528106 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lllsm"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.539431 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9abb-account-create-update-sp975"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.540365 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9abb-account-create-update-sp975" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.545973 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.565994 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmfp8\" (UniqueName: \"kubernetes.io/projected/acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b-kube-api-access-xmfp8\") pod \"barbican-db-create-wrb54\" (UID: \"acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b\") " pod="openstack/barbican-db-create-wrb54" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.570751 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9abb-account-create-update-sp975"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.575274 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj9t7\" (UniqueName: 
\"kubernetes.io/projected/ac3fb1a3-21e9-4a67-8c15-e0e12424a5df-kube-api-access-mj9t7\") pod \"barbican-79e6-account-create-update-vkr56\" (UID: \"ac3fb1a3-21e9-4a67-8c15-e0e12424a5df\") " pod="openstack/barbican-79e6-account-create-update-vkr56" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.611933 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wrb54" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.617663 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vs2sl"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.619336 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vs2sl" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.622456 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-87mjc" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.627907 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.630025 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.630692 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.653642 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vs2sl"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.662945 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-79e6-account-create-update-vkr56" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.722776 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfnvl\" (UniqueName: \"kubernetes.io/projected/9ca61fbb-1714-4b6c-aca8-547813e7d581-kube-api-access-tfnvl\") pod \"keystone-db-sync-vs2sl\" (UID: \"9ca61fbb-1714-4b6c-aca8-547813e7d581\") " pod="openstack/keystone-db-sync-vs2sl" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.722867 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6030a43a-5893-4e6d-919a-c6c0ca0832a8-operator-scripts\") pod \"neutron-db-create-lllsm\" (UID: \"6030a43a-5893-4e6d-919a-c6c0ca0832a8\") " pod="openstack/neutron-db-create-lllsm" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.722891 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdn97\" (UniqueName: \"kubernetes.io/projected/6030a43a-5893-4e6d-919a-c6c0ca0832a8-kube-api-access-hdn97\") pod \"neutron-db-create-lllsm\" (UID: \"6030a43a-5893-4e6d-919a-c6c0ca0832a8\") " pod="openstack/neutron-db-create-lllsm" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.722917 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8221015a-61ac-474e-97ea-be2c233e4139-operator-scripts\") pod \"neutron-9abb-account-create-update-sp975\" (UID: \"8221015a-61ac-474e-97ea-be2c233e4139\") " pod="openstack/neutron-9abb-account-create-update-sp975" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.722974 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9ca61fbb-1714-4b6c-aca8-547813e7d581-combined-ca-bundle\") pod \"keystone-db-sync-vs2sl\" (UID: \"9ca61fbb-1714-4b6c-aca8-547813e7d581\") " pod="openstack/keystone-db-sync-vs2sl" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.722995 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k2pc\" (UniqueName: \"kubernetes.io/projected/8221015a-61ac-474e-97ea-be2c233e4139-kube-api-access-5k2pc\") pod \"neutron-9abb-account-create-update-sp975\" (UID: \"8221015a-61ac-474e-97ea-be2c233e4139\") " pod="openstack/neutron-9abb-account-create-update-sp975" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.723016 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca61fbb-1714-4b6c-aca8-547813e7d581-config-data\") pod \"keystone-db-sync-vs2sl\" (UID: \"9ca61fbb-1714-4b6c-aca8-547813e7d581\") " pod="openstack/keystone-db-sync-vs2sl" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.766703 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.824398 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6030a43a-5893-4e6d-919a-c6c0ca0832a8-operator-scripts\") pod \"neutron-db-create-lllsm\" (UID: \"6030a43a-5893-4e6d-919a-c6c0ca0832a8\") " pod="openstack/neutron-db-create-lllsm" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.824433 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdn97\" (UniqueName: \"kubernetes.io/projected/6030a43a-5893-4e6d-919a-c6c0ca0832a8-kube-api-access-hdn97\") pod \"neutron-db-create-lllsm\" (UID: \"6030a43a-5893-4e6d-919a-c6c0ca0832a8\") " pod="openstack/neutron-db-create-lllsm" Mar 18 
18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.824472 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8221015a-61ac-474e-97ea-be2c233e4139-operator-scripts\") pod \"neutron-9abb-account-create-update-sp975\" (UID: \"8221015a-61ac-474e-97ea-be2c233e4139\") " pod="openstack/neutron-9abb-account-create-update-sp975" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.824538 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca61fbb-1714-4b6c-aca8-547813e7d581-combined-ca-bundle\") pod \"keystone-db-sync-vs2sl\" (UID: \"9ca61fbb-1714-4b6c-aca8-547813e7d581\") " pod="openstack/keystone-db-sync-vs2sl" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.824571 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k2pc\" (UniqueName: \"kubernetes.io/projected/8221015a-61ac-474e-97ea-be2c233e4139-kube-api-access-5k2pc\") pod \"neutron-9abb-account-create-update-sp975\" (UID: \"8221015a-61ac-474e-97ea-be2c233e4139\") " pod="openstack/neutron-9abb-account-create-update-sp975" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.824590 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca61fbb-1714-4b6c-aca8-547813e7d581-config-data\") pod \"keystone-db-sync-vs2sl\" (UID: \"9ca61fbb-1714-4b6c-aca8-547813e7d581\") " pod="openstack/keystone-db-sync-vs2sl" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.824613 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfnvl\" (UniqueName: \"kubernetes.io/projected/9ca61fbb-1714-4b6c-aca8-547813e7d581-kube-api-access-tfnvl\") pod \"keystone-db-sync-vs2sl\" (UID: \"9ca61fbb-1714-4b6c-aca8-547813e7d581\") " pod="openstack/keystone-db-sync-vs2sl" Mar 18 18:23:00 crc 
kubenswrapper[5008]: I0318 18:23:00.825614 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8221015a-61ac-474e-97ea-be2c233e4139-operator-scripts\") pod \"neutron-9abb-account-create-update-sp975\" (UID: \"8221015a-61ac-474e-97ea-be2c233e4139\") " pod="openstack/neutron-9abb-account-create-update-sp975" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.826076 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6030a43a-5893-4e6d-919a-c6c0ca0832a8-operator-scripts\") pod \"neutron-db-create-lllsm\" (UID: \"6030a43a-5893-4e6d-919a-c6c0ca0832a8\") " pod="openstack/neutron-db-create-lllsm" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.834462 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca61fbb-1714-4b6c-aca8-547813e7d581-config-data\") pod \"keystone-db-sync-vs2sl\" (UID: \"9ca61fbb-1714-4b6c-aca8-547813e7d581\") " pod="openstack/keystone-db-sync-vs2sl" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.838042 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb65b4b5-l4tzz"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.838200 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca61fbb-1714-4b6c-aca8-547813e7d581-combined-ca-bundle\") pod \"keystone-db-sync-vs2sl\" (UID: \"9ca61fbb-1714-4b6c-aca8-547813e7d581\") " pod="openstack/keystone-db-sync-vs2sl" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.839280 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.847070 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.855007 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k2pc\" (UniqueName: \"kubernetes.io/projected/8221015a-61ac-474e-97ea-be2c233e4139-kube-api-access-5k2pc\") pod \"neutron-9abb-account-create-update-sp975\" (UID: \"8221015a-61ac-474e-97ea-be2c233e4139\") " pod="openstack/neutron-9abb-account-create-update-sp975" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.858009 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdn97\" (UniqueName: \"kubernetes.io/projected/6030a43a-5893-4e6d-919a-c6c0ca0832a8-kube-api-access-hdn97\") pod \"neutron-db-create-lllsm\" (UID: \"6030a43a-5893-4e6d-919a-c6c0ca0832a8\") " pod="openstack/neutron-db-create-lllsm" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.876320 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfnvl\" (UniqueName: \"kubernetes.io/projected/9ca61fbb-1714-4b6c-aca8-547813e7d581-kube-api-access-tfnvl\") pod \"keystone-db-sync-vs2sl\" (UID: \"9ca61fbb-1714-4b6c-aca8-547813e7d581\") " pod="openstack/keystone-db-sync-vs2sl" Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.929039 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb65b4b5-l4tzz"] Mar 18 18:23:00 crc kubenswrapper[5008]: I0318 18:23:00.940971 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9abb-account-create-update-sp975" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.027344 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-ovsdbserver-nb\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.027382 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-dns-swift-storage-0\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.027431 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-dns-svc\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.027453 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5dxb\" (UniqueName: \"kubernetes.io/projected/31f1fbe4-9cb2-41cd-938b-040adc62c26f-kube-api-access-n5dxb\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.027471 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-ovsdbserver-sb\") pod 
\"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.027494 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-config\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.032178 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-968pz"] Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.049293 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vs2sl" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.129299 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-ovsdbserver-nb\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.129670 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-dns-swift-storage-0\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.129743 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-dns-svc\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" 
Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.129766 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5dxb\" (UniqueName: \"kubernetes.io/projected/31f1fbe4-9cb2-41cd-938b-040adc62c26f-kube-api-access-n5dxb\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.129808 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-ovsdbserver-sb\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.130183 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-config\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.130557 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-ovsdbserver-nb\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.130932 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-dns-svc\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.131052 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-ovsdbserver-sb\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.131172 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-config\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.132268 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-dns-swift-storage-0\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.146974 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lllsm" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.153672 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5dxb\" (UniqueName: \"kubernetes.io/projected/31f1fbe4-9cb2-41cd-938b-040adc62c26f-kube-api-access-n5dxb\") pod \"dnsmasq-dns-cb65b4b5-l4tzz\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.165228 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.246122 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3502-account-create-update-m52l9"] Mar 18 18:23:01 crc kubenswrapper[5008]: W0318 18:23:01.248873 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82798150_dec0_4bf6_a917_afe9e9ad020d.slice/crio-d5649d0a2b5cf9bc3d99cc517163eba14ea0e813ccd99e7a6ad5eb02ff27fb52 WatchSource:0}: Error finding container d5649d0a2b5cf9bc3d99cc517163eba14ea0e813ccd99e7a6ad5eb02ff27fb52: Status 404 returned error can't find the container with id d5649d0a2b5cf9bc3d99cc517163eba14ea0e813ccd99e7a6ad5eb02ff27fb52 Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.364663 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9abb-account-create-update-sp975"] Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.422544 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-968pz" event={"ID":"ba33af3d-82b3-406f-9b9d-9511c8e874c1","Type":"ContainerStarted","Data":"38c35231cb4f965f24f273cdc08bc89c2aee65ecd48f5c1502dae6903a1aea69"} Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.422593 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-968pz" event={"ID":"ba33af3d-82b3-406f-9b9d-9511c8e874c1","Type":"ContainerStarted","Data":"2f5eb1e2d9eca1dcb87762b817398d0ffbfa1b752885fd9fa5cbbd18b621f5a7"} Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.429611 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3502-account-create-update-m52l9" event={"ID":"82798150-dec0-4bf6-a917-afe9e9ad020d","Type":"ContainerStarted","Data":"d5649d0a2b5cf9bc3d99cc517163eba14ea0e813ccd99e7a6ad5eb02ff27fb52"} Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.468356 5008 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/barbican-db-create-wrb54"] Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.494990 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-79e6-account-create-update-vkr56"] Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.542356 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-968pz" podStartSLOduration=2.54233776 podStartE2EDuration="2.54233776s" podCreationTimestamp="2026-03-18 18:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:01.456181347 +0000 UTC m=+1237.975654456" watchObservedRunningTime="2026-03-18 18:23:01.54233776 +0000 UTC m=+1238.061810839" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.637887 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vs2sl"] Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.875685 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lllsm"] Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.878206 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.967938 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-log-ovn\") pod \"68176642-ba72-42e3-8493-57b042ee7ac8\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.968042 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4jzc\" (UniqueName: \"kubernetes.io/projected/68176642-ba72-42e3-8493-57b042ee7ac8-kube-api-access-h4jzc\") pod \"68176642-ba72-42e3-8493-57b042ee7ac8\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.968060 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "68176642-ba72-42e3-8493-57b042ee7ac8" (UID: "68176642-ba72-42e3-8493-57b042ee7ac8"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.968104 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68176642-ba72-42e3-8493-57b042ee7ac8-scripts\") pod \"68176642-ba72-42e3-8493-57b042ee7ac8\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.968134 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-run\") pod \"68176642-ba72-42e3-8493-57b042ee7ac8\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.968151 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/68176642-ba72-42e3-8493-57b042ee7ac8-additional-scripts\") pod \"68176642-ba72-42e3-8493-57b042ee7ac8\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.968236 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-run-ovn\") pod \"68176642-ba72-42e3-8493-57b042ee7ac8\" (UID: \"68176642-ba72-42e3-8493-57b042ee7ac8\") " Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.968316 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-run" (OuterVolumeSpecName: "var-run") pod "68176642-ba72-42e3-8493-57b042ee7ac8" (UID: "68176642-ba72-42e3-8493-57b042ee7ac8"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.968415 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "68176642-ba72-42e3-8493-57b042ee7ac8" (UID: "68176642-ba72-42e3-8493-57b042ee7ac8"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.968857 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68176642-ba72-42e3-8493-57b042ee7ac8-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "68176642-ba72-42e3-8493-57b042ee7ac8" (UID: "68176642-ba72-42e3-8493-57b042ee7ac8"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.969090 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68176642-ba72-42e3-8493-57b042ee7ac8-scripts" (OuterVolumeSpecName: "scripts") pod "68176642-ba72-42e3-8493-57b042ee7ac8" (UID: "68176642-ba72-42e3-8493-57b042ee7ac8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.969435 5008 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.969450 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68176642-ba72-42e3-8493-57b042ee7ac8-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.969459 5008 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.969467 5008 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/68176642-ba72-42e3-8493-57b042ee7ac8-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.969476 5008 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68176642-ba72-42e3-8493-57b042ee7ac8-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:01 crc kubenswrapper[5008]: I0318 18:23:01.973891 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68176642-ba72-42e3-8493-57b042ee7ac8-kube-api-access-h4jzc" (OuterVolumeSpecName: "kube-api-access-h4jzc") pod "68176642-ba72-42e3-8493-57b042ee7ac8" (UID: "68176642-ba72-42e3-8493-57b042ee7ac8"). InnerVolumeSpecName "kube-api-access-h4jzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.002954 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb65b4b5-l4tzz"] Mar 18 18:23:02 crc kubenswrapper[5008]: W0318 18:23:02.007002 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1fbe4_9cb2_41cd_938b_040adc62c26f.slice/crio-7f49a1e4b575632a33060bfb1c335d85f99380982bde2b006ce8aad46fcf858c WatchSource:0}: Error finding container 7f49a1e4b575632a33060bfb1c335d85f99380982bde2b006ce8aad46fcf858c: Status 404 returned error can't find the container with id 7f49a1e4b575632a33060bfb1c335d85f99380982bde2b006ce8aad46fcf858c Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.071397 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4jzc\" (UniqueName: \"kubernetes.io/projected/68176642-ba72-42e3-8493-57b042ee7ac8-kube-api-access-h4jzc\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.439291 5008 generic.go:334] "Generic (PLEG): container finished" podID="82798150-dec0-4bf6-a917-afe9e9ad020d" containerID="f3b95184a7984c98dbe7c421b1a76f5828bb070cd349c4211c466713abbb44bd" exitCode=0 Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.439394 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3502-account-create-update-m52l9" event={"ID":"82798150-dec0-4bf6-a917-afe9e9ad020d","Type":"ContainerDied","Data":"f3b95184a7984c98dbe7c421b1a76f5828bb070cd349c4211c466713abbb44bd"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.441127 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vs2sl" event={"ID":"9ca61fbb-1714-4b6c-aca8-547813e7d581","Type":"ContainerStarted","Data":"5f234154bbd4fa8c98daa18ceba5caf533b8671ba5688f0bc0f09ea53301f84a"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.443123 5008 
generic.go:334] "Generic (PLEG): container finished" podID="acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b" containerID="47effa833330afdff41754f56f12e67efb2dfaec92e6efa219903a6b7c1fe169" exitCode=0 Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.443188 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wrb54" event={"ID":"acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b","Type":"ContainerDied","Data":"47effa833330afdff41754f56f12e67efb2dfaec92e6efa219903a6b7c1fe169"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.443205 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wrb54" event={"ID":"acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b","Type":"ContainerStarted","Data":"5cc505d36da0c0c7dd9314885d99b1d831a99c818865e59479b1ab6b589b9147"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.444708 5008 generic.go:334] "Generic (PLEG): container finished" podID="6030a43a-5893-4e6d-919a-c6c0ca0832a8" containerID="c5ebbf7fdaca2200cb1d9823b9896d5673972b9ae440af8084db85d94e9e1a85" exitCode=0 Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.444822 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lllsm" event={"ID":"6030a43a-5893-4e6d-919a-c6c0ca0832a8","Type":"ContainerDied","Data":"c5ebbf7fdaca2200cb1d9823b9896d5673972b9ae440af8084db85d94e9e1a85"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.444935 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lllsm" event={"ID":"6030a43a-5893-4e6d-919a-c6c0ca0832a8","Type":"ContainerStarted","Data":"ebaadfea603d340c0edb47fd62db98c3952c4a13cfb124474a998aad175f9e2b"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.446988 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9qcqj-config-knp7f" Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.447222 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9qcqj-config-knp7f" event={"ID":"68176642-ba72-42e3-8493-57b042ee7ac8","Type":"ContainerDied","Data":"39c824b90efaad7d881f7adb4b53267bfd4db9871d104ae0ca429fecb399e086"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.447245 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39c824b90efaad7d881f7adb4b53267bfd4db9871d104ae0ca429fecb399e086" Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.449804 5008 generic.go:334] "Generic (PLEG): container finished" podID="8221015a-61ac-474e-97ea-be2c233e4139" containerID="b2bee2c950af6264c90593b94283891f75fe0f285363f33017f08c83c4819718" exitCode=0 Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.449870 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9abb-account-create-update-sp975" event={"ID":"8221015a-61ac-474e-97ea-be2c233e4139","Type":"ContainerDied","Data":"b2bee2c950af6264c90593b94283891f75fe0f285363f33017f08c83c4819718"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.449900 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9abb-account-create-update-sp975" event={"ID":"8221015a-61ac-474e-97ea-be2c233e4139","Type":"ContainerStarted","Data":"0e956da295b50cf6425a0d7b5489a2a0dbe11bfcfb04c36c4b3d3aefc1932aea"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.451505 5008 generic.go:334] "Generic (PLEG): container finished" podID="31f1fbe4-9cb2-41cd-938b-040adc62c26f" containerID="99b07ff0c3b074b2c4771b3895d7549538b2266d65d8046b217b271df5dfcb59" exitCode=0 Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.451580 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" 
event={"ID":"31f1fbe4-9cb2-41cd-938b-040adc62c26f","Type":"ContainerDied","Data":"99b07ff0c3b074b2c4771b3895d7549538b2266d65d8046b217b271df5dfcb59"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.451602 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" event={"ID":"31f1fbe4-9cb2-41cd-938b-040adc62c26f","Type":"ContainerStarted","Data":"7f49a1e4b575632a33060bfb1c335d85f99380982bde2b006ce8aad46fcf858c"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.456508 5008 generic.go:334] "Generic (PLEG): container finished" podID="ac3fb1a3-21e9-4a67-8c15-e0e12424a5df" containerID="b170adfe134c9b486f05f7ade0111e4b3038c407224af9fab38d1f5b42aacb25" exitCode=0 Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.456600 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-79e6-account-create-update-vkr56" event={"ID":"ac3fb1a3-21e9-4a67-8c15-e0e12424a5df","Type":"ContainerDied","Data":"b170adfe134c9b486f05f7ade0111e4b3038c407224af9fab38d1f5b42aacb25"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.456664 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-79e6-account-create-update-vkr56" event={"ID":"ac3fb1a3-21e9-4a67-8c15-e0e12424a5df","Type":"ContainerStarted","Data":"c8eb74e1621aed87c74384fb20a68535279f31a193d1f4b70c1292ce01d5b508"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.459021 5008 generic.go:334] "Generic (PLEG): container finished" podID="ba33af3d-82b3-406f-9b9d-9511c8e874c1" containerID="38c35231cb4f965f24f273cdc08bc89c2aee65ecd48f5c1502dae6903a1aea69" exitCode=0 Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.459085 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-968pz" event={"ID":"ba33af3d-82b3-406f-9b9d-9511c8e874c1","Type":"ContainerDied","Data":"38c35231cb4f965f24f273cdc08bc89c2aee65ecd48f5c1502dae6903a1aea69"} Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.956985 
5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9qcqj-config-knp7f"] Mar 18 18:23:02 crc kubenswrapper[5008]: I0318 18:23:02.968574 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9qcqj-config-knp7f"] Mar 18 18:23:03 crc kubenswrapper[5008]: I0318 18:23:03.468653 5008 generic.go:334] "Generic (PLEG): container finished" podID="1558ccb4-f7d0-4b6d-a458-b13cb927f6b3" containerID="301939a383694f0855d653da76c5187011b8a736256c7435c14a333754982b7e" exitCode=0 Mar 18 18:23:03 crc kubenswrapper[5008]: I0318 18:23:03.468747 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jqd2z" event={"ID":"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3","Type":"ContainerDied","Data":"301939a383694f0855d653da76c5187011b8a736256c7435c14a333754982b7e"} Mar 18 18:23:03 crc kubenswrapper[5008]: I0318 18:23:03.473027 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" event={"ID":"31f1fbe4-9cb2-41cd-938b-040adc62c26f","Type":"ContainerStarted","Data":"787b44f6a7b8b25be27ed10d87328e8df18dd7a70ac13e771cd0f27a5371a878"} Mar 18 18:23:03 crc kubenswrapper[5008]: I0318 18:23:03.527805 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" podStartSLOduration=3.527789489 podStartE2EDuration="3.527789489s" podCreationTimestamp="2026-03-18 18:23:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:03.526064556 +0000 UTC m=+1240.045537635" watchObservedRunningTime="2026-03-18 18:23:03.527789489 +0000 UTC m=+1240.047262568" Mar 18 18:23:03 crc kubenswrapper[5008]: I0318 18:23:03.888566 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9abb-account-create-update-sp975" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.017109 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lllsm" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.019754 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8221015a-61ac-474e-97ea-be2c233e4139-operator-scripts\") pod \"8221015a-61ac-474e-97ea-be2c233e4139\" (UID: \"8221015a-61ac-474e-97ea-be2c233e4139\") " Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.019850 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k2pc\" (UniqueName: \"kubernetes.io/projected/8221015a-61ac-474e-97ea-be2c233e4139-kube-api-access-5k2pc\") pod \"8221015a-61ac-474e-97ea-be2c233e4139\" (UID: \"8221015a-61ac-474e-97ea-be2c233e4139\") " Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.020433 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8221015a-61ac-474e-97ea-be2c233e4139-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8221015a-61ac-474e-97ea-be2c233e4139" (UID: "8221015a-61ac-474e-97ea-be2c233e4139"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.020708 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8221015a-61ac-474e-97ea-be2c233e4139-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.032676 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8221015a-61ac-474e-97ea-be2c233e4139-kube-api-access-5k2pc" (OuterVolumeSpecName: "kube-api-access-5k2pc") pod "8221015a-61ac-474e-97ea-be2c233e4139" (UID: "8221015a-61ac-474e-97ea-be2c233e4139"). InnerVolumeSpecName "kube-api-access-5k2pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.048443 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-79e6-account-create-update-vkr56" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.073792 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3502-account-create-update-m52l9" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.080841 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-968pz" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.091353 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wrb54" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.121755 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdn97\" (UniqueName: \"kubernetes.io/projected/6030a43a-5893-4e6d-919a-c6c0ca0832a8-kube-api-access-hdn97\") pod \"6030a43a-5893-4e6d-919a-c6c0ca0832a8\" (UID: \"6030a43a-5893-4e6d-919a-c6c0ca0832a8\") " Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.121881 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6030a43a-5893-4e6d-919a-c6c0ca0832a8-operator-scripts\") pod \"6030a43a-5893-4e6d-919a-c6c0ca0832a8\" (UID: \"6030a43a-5893-4e6d-919a-c6c0ca0832a8\") " Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.122221 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k2pc\" (UniqueName: \"kubernetes.io/projected/8221015a-61ac-474e-97ea-be2c233e4139-kube-api-access-5k2pc\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.122360 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6030a43a-5893-4e6d-919a-c6c0ca0832a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6030a43a-5893-4e6d-919a-c6c0ca0832a8" (UID: "6030a43a-5893-4e6d-919a-c6c0ca0832a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.124631 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6030a43a-5893-4e6d-919a-c6c0ca0832a8-kube-api-access-hdn97" (OuterVolumeSpecName: "kube-api-access-hdn97") pod "6030a43a-5893-4e6d-919a-c6c0ca0832a8" (UID: "6030a43a-5893-4e6d-919a-c6c0ca0832a8"). InnerVolumeSpecName "kube-api-access-hdn97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.209217 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68176642-ba72-42e3-8493-57b042ee7ac8" path="/var/lib/kubelet/pods/68176642-ba72-42e3-8493-57b042ee7ac8/volumes" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.223015 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmfp8\" (UniqueName: \"kubernetes.io/projected/acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b-kube-api-access-xmfp8\") pod \"acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b\" (UID: \"acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b\") " Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.223092 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj9t7\" (UniqueName: \"kubernetes.io/projected/ac3fb1a3-21e9-4a67-8c15-e0e12424a5df-kube-api-access-mj9t7\") pod \"ac3fb1a3-21e9-4a67-8c15-e0e12424a5df\" (UID: \"ac3fb1a3-21e9-4a67-8c15-e0e12424a5df\") " Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.223147 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b-operator-scripts\") pod \"acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b\" (UID: \"acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b\") " Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.223171 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba33af3d-82b3-406f-9b9d-9511c8e874c1-operator-scripts\") pod \"ba33af3d-82b3-406f-9b9d-9511c8e874c1\" (UID: \"ba33af3d-82b3-406f-9b9d-9511c8e874c1\") " Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.223924 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b" (UID: "acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.224019 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac3fb1a3-21e9-4a67-8c15-e0e12424a5df-operator-scripts\") pod \"ac3fb1a3-21e9-4a67-8c15-e0e12424a5df\" (UID: \"ac3fb1a3-21e9-4a67-8c15-e0e12424a5df\") " Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.224018 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba33af3d-82b3-406f-9b9d-9511c8e874c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba33af3d-82b3-406f-9b9d-9511c8e874c1" (UID: "ba33af3d-82b3-406f-9b9d-9511c8e874c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.224096 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82798150-dec0-4bf6-a917-afe9e9ad020d-operator-scripts\") pod \"82798150-dec0-4bf6-a917-afe9e9ad020d\" (UID: \"82798150-dec0-4bf6-a917-afe9e9ad020d\") " Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.224458 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac3fb1a3-21e9-4a67-8c15-e0e12424a5df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac3fb1a3-21e9-4a67-8c15-e0e12424a5df" (UID: "ac3fb1a3-21e9-4a67-8c15-e0e12424a5df"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.224598 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82798150-dec0-4bf6-a917-afe9e9ad020d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82798150-dec0-4bf6-a917-afe9e9ad020d" (UID: "82798150-dec0-4bf6-a917-afe9e9ad020d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.225098 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkgtl\" (UniqueName: \"kubernetes.io/projected/ba33af3d-82b3-406f-9b9d-9511c8e874c1-kube-api-access-xkgtl\") pod \"ba33af3d-82b3-406f-9b9d-9511c8e874c1\" (UID: \"ba33af3d-82b3-406f-9b9d-9511c8e874c1\") " Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.225131 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncwbs\" (UniqueName: \"kubernetes.io/projected/82798150-dec0-4bf6-a917-afe9e9ad020d-kube-api-access-ncwbs\") pod \"82798150-dec0-4bf6-a917-afe9e9ad020d\" (UID: \"82798150-dec0-4bf6-a917-afe9e9ad020d\") " Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.225370 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.225381 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba33af3d-82b3-406f-9b9d-9511c8e874c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.225390 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ac3fb1a3-21e9-4a67-8c15-e0e12424a5df-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.225398 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdn97\" (UniqueName: \"kubernetes.io/projected/6030a43a-5893-4e6d-919a-c6c0ca0832a8-kube-api-access-hdn97\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.225407 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82798150-dec0-4bf6-a917-afe9e9ad020d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.225416 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6030a43a-5893-4e6d-919a-c6c0ca0832a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.227035 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b-kube-api-access-xmfp8" (OuterVolumeSpecName: "kube-api-access-xmfp8") pod "acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b" (UID: "acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b"). InnerVolumeSpecName "kube-api-access-xmfp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.227130 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3fb1a3-21e9-4a67-8c15-e0e12424a5df-kube-api-access-mj9t7" (OuterVolumeSpecName: "kube-api-access-mj9t7") pod "ac3fb1a3-21e9-4a67-8c15-e0e12424a5df" (UID: "ac3fb1a3-21e9-4a67-8c15-e0e12424a5df"). InnerVolumeSpecName "kube-api-access-mj9t7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.229248 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba33af3d-82b3-406f-9b9d-9511c8e874c1-kube-api-access-xkgtl" (OuterVolumeSpecName: "kube-api-access-xkgtl") pod "ba33af3d-82b3-406f-9b9d-9511c8e874c1" (UID: "ba33af3d-82b3-406f-9b9d-9511c8e874c1"). InnerVolumeSpecName "kube-api-access-xkgtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.230480 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82798150-dec0-4bf6-a917-afe9e9ad020d-kube-api-access-ncwbs" (OuterVolumeSpecName: "kube-api-access-ncwbs") pod "82798150-dec0-4bf6-a917-afe9e9ad020d" (UID: "82798150-dec0-4bf6-a917-afe9e9ad020d"). InnerVolumeSpecName "kube-api-access-ncwbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.330096 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj9t7\" (UniqueName: \"kubernetes.io/projected/ac3fb1a3-21e9-4a67-8c15-e0e12424a5df-kube-api-access-mj9t7\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.330144 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkgtl\" (UniqueName: \"kubernetes.io/projected/ba33af3d-82b3-406f-9b9d-9511c8e874c1-kube-api-access-xkgtl\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.330162 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncwbs\" (UniqueName: \"kubernetes.io/projected/82798150-dec0-4bf6-a917-afe9e9ad020d-kube-api-access-ncwbs\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.330176 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmfp8\" (UniqueName: 
\"kubernetes.io/projected/acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b-kube-api-access-xmfp8\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.483048 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lllsm" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.483043 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lllsm" event={"ID":"6030a43a-5893-4e6d-919a-c6c0ca0832a8","Type":"ContainerDied","Data":"ebaadfea603d340c0edb47fd62db98c3952c4a13cfb124474a998aad175f9e2b"} Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.483096 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebaadfea603d340c0edb47fd62db98c3952c4a13cfb124474a998aad175f9e2b" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.485117 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wrb54" event={"ID":"acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b","Type":"ContainerDied","Data":"5cc505d36da0c0c7dd9314885d99b1d831a99c818865e59479b1ab6b589b9147"} Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.485132 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wrb54" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.485138 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cc505d36da0c0c7dd9314885d99b1d831a99c818865e59479b1ab6b589b9147" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.487127 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9abb-account-create-update-sp975" event={"ID":"8221015a-61ac-474e-97ea-be2c233e4139","Type":"ContainerDied","Data":"0e956da295b50cf6425a0d7b5489a2a0dbe11bfcfb04c36c4b3d3aefc1932aea"} Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.487150 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e956da295b50cf6425a0d7b5489a2a0dbe11bfcfb04c36c4b3d3aefc1932aea" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.487198 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9abb-account-create-update-sp975" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.490645 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-79e6-account-create-update-vkr56" event={"ID":"ac3fb1a3-21e9-4a67-8c15-e0e12424a5df","Type":"ContainerDied","Data":"c8eb74e1621aed87c74384fb20a68535279f31a193d1f4b70c1292ce01d5b508"} Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.490690 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8eb74e1621aed87c74384fb20a68535279f31a193d1f4b70c1292ce01d5b508" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.490747 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-79e6-account-create-update-vkr56" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.493995 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-968pz" event={"ID":"ba33af3d-82b3-406f-9b9d-9511c8e874c1","Type":"ContainerDied","Data":"2f5eb1e2d9eca1dcb87762b817398d0ffbfa1b752885fd9fa5cbbd18b621f5a7"} Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.494036 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f5eb1e2d9eca1dcb87762b817398d0ffbfa1b752885fd9fa5cbbd18b621f5a7" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.494095 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-968pz" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.508393 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3502-account-create-update-m52l9" event={"ID":"82798150-dec0-4bf6-a917-afe9e9ad020d","Type":"ContainerDied","Data":"d5649d0a2b5cf9bc3d99cc517163eba14ea0e813ccd99e7a6ad5eb02ff27fb52"} Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.508442 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5649d0a2b5cf9bc3d99cc517163eba14ea0e813ccd99e7a6ad5eb02ff27fb52" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.508407 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3502-account-create-update-m52l9" Mar 18 18:23:04 crc kubenswrapper[5008]: I0318 18:23:04.508944 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.082144 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jqd2z" Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.180060 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-combined-ca-bundle\") pod \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.180153 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-db-sync-config-data\") pod \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.180253 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-config-data\") pod \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.180281 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65qdw\" (UniqueName: \"kubernetes.io/projected/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-kube-api-access-65qdw\") pod \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\" (UID: \"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3\") " Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.202844 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1558ccb4-f7d0-4b6d-a458-b13cb927f6b3" (UID: "1558ccb4-f7d0-4b6d-a458-b13cb927f6b3"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.203079 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-kube-api-access-65qdw" (OuterVolumeSpecName: "kube-api-access-65qdw") pod "1558ccb4-f7d0-4b6d-a458-b13cb927f6b3" (UID: "1558ccb4-f7d0-4b6d-a458-b13cb927f6b3"). InnerVolumeSpecName "kube-api-access-65qdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.221992 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1558ccb4-f7d0-4b6d-a458-b13cb927f6b3" (UID: "1558ccb4-f7d0-4b6d-a458-b13cb927f6b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.283776 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-config-data" (OuterVolumeSpecName: "config-data") pod "1558ccb4-f7d0-4b6d-a458-b13cb927f6b3" (UID: "1558ccb4-f7d0-4b6d-a458-b13cb927f6b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.289670 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.289700 5008 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.289709 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.289718 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65qdw\" (UniqueName: \"kubernetes.io/projected/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3-kube-api-access-65qdw\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.536549 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vs2sl" event={"ID":"9ca61fbb-1714-4b6c-aca8-547813e7d581","Type":"ContainerStarted","Data":"4047e4f55dd1036cd59241e4a458eb989331ec47896b35605a25dca184e1b9f0"} Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.539113 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jqd2z" event={"ID":"1558ccb4-f7d0-4b6d-a458-b13cb927f6b3","Type":"ContainerDied","Data":"cc5f2900c4b65f84bbe0433f7c5a1fe63cbd4312aec266d08b31fff0b4899c95"} Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.539138 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc5f2900c4b65f84bbe0433f7c5a1fe63cbd4312aec266d08b31fff0b4899c95" Mar 18 18:23:07 crc 
kubenswrapper[5008]: I0318 18:23:07.539368 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jqd2z" Mar 18 18:23:07 crc kubenswrapper[5008]: I0318 18:23:07.562124 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vs2sl" podStartSLOduration=2.316024052 podStartE2EDuration="7.56209914s" podCreationTimestamp="2026-03-18 18:23:00 +0000 UTC" firstStartedPulling="2026-03-18 18:23:01.686168764 +0000 UTC m=+1238.205641843" lastFinishedPulling="2026-03-18 18:23:06.932243802 +0000 UTC m=+1243.451716931" observedRunningTime="2026-03-18 18:23:07.560337236 +0000 UTC m=+1244.079810315" watchObservedRunningTime="2026-03-18 18:23:07.56209914 +0000 UTC m=+1244.081572239" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.533079 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb65b4b5-l4tzz"] Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.533343 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" podUID="31f1fbe4-9cb2-41cd-938b-040adc62c26f" containerName="dnsmasq-dns" containerID="cri-o://787b44f6a7b8b25be27ed10d87328e8df18dd7a70ac13e771cd0f27a5371a878" gracePeriod=10 Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.535674 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.572614 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d4f86d997-dknkr"] Mar 18 18:23:08 crc kubenswrapper[5008]: E0318 18:23:08.573083 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82798150-dec0-4bf6-a917-afe9e9ad020d" containerName="mariadb-account-create-update" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573101 5008 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="82798150-dec0-4bf6-a917-afe9e9ad020d" containerName="mariadb-account-create-update" Mar 18 18:23:08 crc kubenswrapper[5008]: E0318 18:23:08.573123 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6030a43a-5893-4e6d-919a-c6c0ca0832a8" containerName="mariadb-database-create" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573133 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6030a43a-5893-4e6d-919a-c6c0ca0832a8" containerName="mariadb-database-create" Mar 18 18:23:08 crc kubenswrapper[5008]: E0318 18:23:08.573146 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68176642-ba72-42e3-8493-57b042ee7ac8" containerName="ovn-config" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573154 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="68176642-ba72-42e3-8493-57b042ee7ac8" containerName="ovn-config" Mar 18 18:23:08 crc kubenswrapper[5008]: E0318 18:23:08.573166 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b" containerName="mariadb-database-create" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573173 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b" containerName="mariadb-database-create" Mar 18 18:23:08 crc kubenswrapper[5008]: E0318 18:23:08.573185 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3fb1a3-21e9-4a67-8c15-e0e12424a5df" containerName="mariadb-account-create-update" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573192 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3fb1a3-21e9-4a67-8c15-e0e12424a5df" containerName="mariadb-account-create-update" Mar 18 18:23:08 crc kubenswrapper[5008]: E0318 18:23:08.573203 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1558ccb4-f7d0-4b6d-a458-b13cb927f6b3" containerName="glance-db-sync" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573210 
5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1558ccb4-f7d0-4b6d-a458-b13cb927f6b3" containerName="glance-db-sync" Mar 18 18:23:08 crc kubenswrapper[5008]: E0318 18:23:08.573223 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8221015a-61ac-474e-97ea-be2c233e4139" containerName="mariadb-account-create-update" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573230 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8221015a-61ac-474e-97ea-be2c233e4139" containerName="mariadb-account-create-update" Mar 18 18:23:08 crc kubenswrapper[5008]: E0318 18:23:08.573248 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba33af3d-82b3-406f-9b9d-9511c8e874c1" containerName="mariadb-database-create" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573256 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba33af3d-82b3-406f-9b9d-9511c8e874c1" containerName="mariadb-database-create" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573432 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba33af3d-82b3-406f-9b9d-9511c8e874c1" containerName="mariadb-database-create" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573450 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8221015a-61ac-474e-97ea-be2c233e4139" containerName="mariadb-account-create-update" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573461 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6030a43a-5893-4e6d-919a-c6c0ca0832a8" containerName="mariadb-database-create" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573478 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="68176642-ba72-42e3-8493-57b042ee7ac8" containerName="ovn-config" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573485 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3fb1a3-21e9-4a67-8c15-e0e12424a5df" 
containerName="mariadb-account-create-update" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573497 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1558ccb4-f7d0-4b6d-a458-b13cb927f6b3" containerName="glance-db-sync" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573509 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="82798150-dec0-4bf6-a917-afe9e9ad020d" containerName="mariadb-account-create-update" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.573521 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b" containerName="mariadb-database-create" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.574665 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.590013 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d4f86d997-dknkr"] Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.714409 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.714486 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-dns-svc\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.714541 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-config\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.714578 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.714612 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5vq6\" (UniqueName: \"kubernetes.io/projected/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-kube-api-access-v5vq6\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.714659 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-dns-swift-storage-0\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.815634 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-dns-svc\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.815916 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-config\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.815947 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.815973 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5vq6\" (UniqueName: \"kubernetes.io/projected/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-kube-api-access-v5vq6\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.816022 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-dns-swift-storage-0\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.816062 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.816965 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-config\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.817017 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.817157 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-dns-svc\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.817510 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.817656 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-dns-swift-storage-0\") pod \"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.836614 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5vq6\" (UniqueName: \"kubernetes.io/projected/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-kube-api-access-v5vq6\") pod 
\"dnsmasq-dns-5d4f86d997-dknkr\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:08 crc kubenswrapper[5008]: I0318 18:23:08.918018 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.034830 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.227543 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-dns-swift-storage-0\") pod \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.227835 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-ovsdbserver-sb\") pod \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.227866 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5dxb\" (UniqueName: \"kubernetes.io/projected/31f1fbe4-9cb2-41cd-938b-040adc62c26f-kube-api-access-n5dxb\") pod \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.227984 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-config\") pod \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.228037 5008 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-dns-svc\") pod \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.228096 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-ovsdbserver-nb\") pod \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\" (UID: \"31f1fbe4-9cb2-41cd-938b-040adc62c26f\") " Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.235874 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31f1fbe4-9cb2-41cd-938b-040adc62c26f-kube-api-access-n5dxb" (OuterVolumeSpecName: "kube-api-access-n5dxb") pod "31f1fbe4-9cb2-41cd-938b-040adc62c26f" (UID: "31f1fbe4-9cb2-41cd-938b-040adc62c26f"). InnerVolumeSpecName "kube-api-access-n5dxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.278497 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "31f1fbe4-9cb2-41cd-938b-040adc62c26f" (UID: "31f1fbe4-9cb2-41cd-938b-040adc62c26f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.286114 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "31f1fbe4-9cb2-41cd-938b-040adc62c26f" (UID: "31f1fbe4-9cb2-41cd-938b-040adc62c26f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.286745 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "31f1fbe4-9cb2-41cd-938b-040adc62c26f" (UID: "31f1fbe4-9cb2-41cd-938b-040adc62c26f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.296190 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31f1fbe4-9cb2-41cd-938b-040adc62c26f" (UID: "31f1fbe4-9cb2-41cd-938b-040adc62c26f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.302230 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-config" (OuterVolumeSpecName: "config") pod "31f1fbe4-9cb2-41cd-938b-040adc62c26f" (UID: "31f1fbe4-9cb2-41cd-938b-040adc62c26f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.330295 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.330463 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.330478 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.330488 5008 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.330499 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31f1fbe4-9cb2-41cd-938b-040adc62c26f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.330507 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5dxb\" (UniqueName: \"kubernetes.io/projected/31f1fbe4-9cb2-41cd-938b-040adc62c26f-kube-api-access-n5dxb\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:09 crc kubenswrapper[5008]: W0318 18:23:09.479837 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod336cb77a_c6b8_4e03_851e_42e3cfff7ea6.slice/crio-fa33a3ad789e1819ca6d16a24f28aaef1535212987cd9e899e4ead5a3cd6ed4a WatchSource:0}: Error finding 
container fa33a3ad789e1819ca6d16a24f28aaef1535212987cd9e899e4ead5a3cd6ed4a: Status 404 returned error can't find the container with id fa33a3ad789e1819ca6d16a24f28aaef1535212987cd9e899e4ead5a3cd6ed4a Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.479927 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d4f86d997-dknkr"] Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.576067 5008 generic.go:334] "Generic (PLEG): container finished" podID="31f1fbe4-9cb2-41cd-938b-040adc62c26f" containerID="787b44f6a7b8b25be27ed10d87328e8df18dd7a70ac13e771cd0f27a5371a878" exitCode=0 Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.576146 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" event={"ID":"31f1fbe4-9cb2-41cd-938b-040adc62c26f","Type":"ContainerDied","Data":"787b44f6a7b8b25be27ed10d87328e8df18dd7a70ac13e771cd0f27a5371a878"} Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.576161 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.576211 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb65b4b5-l4tzz" event={"ID":"31f1fbe4-9cb2-41cd-938b-040adc62c26f","Type":"ContainerDied","Data":"7f49a1e4b575632a33060bfb1c335d85f99380982bde2b006ce8aad46fcf858c"} Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.576243 5008 scope.go:117] "RemoveContainer" containerID="787b44f6a7b8b25be27ed10d87328e8df18dd7a70ac13e771cd0f27a5371a878" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.580126 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" event={"ID":"336cb77a-c6b8-4e03-851e-42e3cfff7ea6","Type":"ContainerStarted","Data":"fa33a3ad789e1819ca6d16a24f28aaef1535212987cd9e899e4ead5a3cd6ed4a"} Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.612769 5008 scope.go:117] "RemoveContainer" containerID="99b07ff0c3b074b2c4771b3895d7549538b2266d65d8046b217b271df5dfcb59" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.633716 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb65b4b5-l4tzz"] Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.642601 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb65b4b5-l4tzz"] Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.645103 5008 scope.go:117] "RemoveContainer" containerID="787b44f6a7b8b25be27ed10d87328e8df18dd7a70ac13e771cd0f27a5371a878" Mar 18 18:23:09 crc kubenswrapper[5008]: E0318 18:23:09.645614 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"787b44f6a7b8b25be27ed10d87328e8df18dd7a70ac13e771cd0f27a5371a878\": container with ID starting with 787b44f6a7b8b25be27ed10d87328e8df18dd7a70ac13e771cd0f27a5371a878 not found: ID does not exist" 
containerID="787b44f6a7b8b25be27ed10d87328e8df18dd7a70ac13e771cd0f27a5371a878" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.645644 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"787b44f6a7b8b25be27ed10d87328e8df18dd7a70ac13e771cd0f27a5371a878"} err="failed to get container status \"787b44f6a7b8b25be27ed10d87328e8df18dd7a70ac13e771cd0f27a5371a878\": rpc error: code = NotFound desc = could not find container \"787b44f6a7b8b25be27ed10d87328e8df18dd7a70ac13e771cd0f27a5371a878\": container with ID starting with 787b44f6a7b8b25be27ed10d87328e8df18dd7a70ac13e771cd0f27a5371a878 not found: ID does not exist" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.645664 5008 scope.go:117] "RemoveContainer" containerID="99b07ff0c3b074b2c4771b3895d7549538b2266d65d8046b217b271df5dfcb59" Mar 18 18:23:09 crc kubenswrapper[5008]: E0318 18:23:09.645907 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99b07ff0c3b074b2c4771b3895d7549538b2266d65d8046b217b271df5dfcb59\": container with ID starting with 99b07ff0c3b074b2c4771b3895d7549538b2266d65d8046b217b271df5dfcb59 not found: ID does not exist" containerID="99b07ff0c3b074b2c4771b3895d7549538b2266d65d8046b217b271df5dfcb59" Mar 18 18:23:09 crc kubenswrapper[5008]: I0318 18:23:09.645927 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b07ff0c3b074b2c4771b3895d7549538b2266d65d8046b217b271df5dfcb59"} err="failed to get container status \"99b07ff0c3b074b2c4771b3895d7549538b2266d65d8046b217b271df5dfcb59\": rpc error: code = NotFound desc = could not find container \"99b07ff0c3b074b2c4771b3895d7549538b2266d65d8046b217b271df5dfcb59\": container with ID starting with 99b07ff0c3b074b2c4771b3895d7549538b2266d65d8046b217b271df5dfcb59 not found: ID does not exist" Mar 18 18:23:10 crc kubenswrapper[5008]: I0318 18:23:10.213408 5008 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31f1fbe4-9cb2-41cd-938b-040adc62c26f" path="/var/lib/kubelet/pods/31f1fbe4-9cb2-41cd-938b-040adc62c26f/volumes" Mar 18 18:23:10 crc kubenswrapper[5008]: I0318 18:23:10.593543 5008 generic.go:334] "Generic (PLEG): container finished" podID="9ca61fbb-1714-4b6c-aca8-547813e7d581" containerID="4047e4f55dd1036cd59241e4a458eb989331ec47896b35605a25dca184e1b9f0" exitCode=0 Mar 18 18:23:10 crc kubenswrapper[5008]: I0318 18:23:10.593687 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vs2sl" event={"ID":"9ca61fbb-1714-4b6c-aca8-547813e7d581","Type":"ContainerDied","Data":"4047e4f55dd1036cd59241e4a458eb989331ec47896b35605a25dca184e1b9f0"} Mar 18 18:23:10 crc kubenswrapper[5008]: I0318 18:23:10.596506 5008 generic.go:334] "Generic (PLEG): container finished" podID="336cb77a-c6b8-4e03-851e-42e3cfff7ea6" containerID="ffb328bd24ce1a8076680623337bc7b4269c3b5241b284e26145bf919d8bc4e0" exitCode=0 Mar 18 18:23:10 crc kubenswrapper[5008]: I0318 18:23:10.596548 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" event={"ID":"336cb77a-c6b8-4e03-851e-42e3cfff7ea6","Type":"ContainerDied","Data":"ffb328bd24ce1a8076680623337bc7b4269c3b5241b284e26145bf919d8bc4e0"} Mar 18 18:23:11 crc kubenswrapper[5008]: I0318 18:23:11.605778 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" event={"ID":"336cb77a-c6b8-4e03-851e-42e3cfff7ea6","Type":"ContainerStarted","Data":"7603b0d9152ec0a915f053943f5c72d723c25fb6a1748180a86d1098b601155e"} Mar 18 18:23:11 crc kubenswrapper[5008]: I0318 18:23:11.606402 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:11 crc kubenswrapper[5008]: I0318 18:23:11.626485 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" 
podStartSLOduration=3.626459123 podStartE2EDuration="3.626459123s" podCreationTimestamp="2026-03-18 18:23:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:11.62233053 +0000 UTC m=+1248.141803619" watchObservedRunningTime="2026-03-18 18:23:11.626459123 +0000 UTC m=+1248.145932212" Mar 18 18:23:11 crc kubenswrapper[5008]: I0318 18:23:11.950187 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vs2sl" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.076951 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca61fbb-1714-4b6c-aca8-547813e7d581-config-data\") pod \"9ca61fbb-1714-4b6c-aca8-547813e7d581\" (UID: \"9ca61fbb-1714-4b6c-aca8-547813e7d581\") " Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.077022 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca61fbb-1714-4b6c-aca8-547813e7d581-combined-ca-bundle\") pod \"9ca61fbb-1714-4b6c-aca8-547813e7d581\" (UID: \"9ca61fbb-1714-4b6c-aca8-547813e7d581\") " Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.077111 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfnvl\" (UniqueName: \"kubernetes.io/projected/9ca61fbb-1714-4b6c-aca8-547813e7d581-kube-api-access-tfnvl\") pod \"9ca61fbb-1714-4b6c-aca8-547813e7d581\" (UID: \"9ca61fbb-1714-4b6c-aca8-547813e7d581\") " Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.082980 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ca61fbb-1714-4b6c-aca8-547813e7d581-kube-api-access-tfnvl" (OuterVolumeSpecName: "kube-api-access-tfnvl") pod "9ca61fbb-1714-4b6c-aca8-547813e7d581" (UID: 
"9ca61fbb-1714-4b6c-aca8-547813e7d581"). InnerVolumeSpecName "kube-api-access-tfnvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.103470 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ca61fbb-1714-4b6c-aca8-547813e7d581-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ca61fbb-1714-4b6c-aca8-547813e7d581" (UID: "9ca61fbb-1714-4b6c-aca8-547813e7d581"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.118707 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ca61fbb-1714-4b6c-aca8-547813e7d581-config-data" (OuterVolumeSpecName: "config-data") pod "9ca61fbb-1714-4b6c-aca8-547813e7d581" (UID: "9ca61fbb-1714-4b6c-aca8-547813e7d581"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.178431 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca61fbb-1714-4b6c-aca8-547813e7d581-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.178463 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca61fbb-1714-4b6c-aca8-547813e7d581-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.178474 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfnvl\" (UniqueName: \"kubernetes.io/projected/9ca61fbb-1714-4b6c-aca8-547813e7d581-kube-api-access-tfnvl\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.620804 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vs2sl" 
event={"ID":"9ca61fbb-1714-4b6c-aca8-547813e7d581","Type":"ContainerDied","Data":"5f234154bbd4fa8c98daa18ceba5caf533b8671ba5688f0bc0f09ea53301f84a"} Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.620888 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f234154bbd4fa8c98daa18ceba5caf533b8671ba5688f0bc0f09ea53301f84a" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.620836 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vs2sl" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.909883 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d4f86d997-dknkr"] Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.927635 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ff6mq"] Mar 18 18:23:12 crc kubenswrapper[5008]: E0318 18:23:12.927972 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f1fbe4-9cb2-41cd-938b-040adc62c26f" containerName="dnsmasq-dns" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.927989 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f1fbe4-9cb2-41cd-938b-040adc62c26f" containerName="dnsmasq-dns" Mar 18 18:23:12 crc kubenswrapper[5008]: E0318 18:23:12.928001 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f1fbe4-9cb2-41cd-938b-040adc62c26f" containerName="init" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.928007 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f1fbe4-9cb2-41cd-938b-040adc62c26f" containerName="init" Mar 18 18:23:12 crc kubenswrapper[5008]: E0318 18:23:12.928021 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ca61fbb-1714-4b6c-aca8-547813e7d581" containerName="keystone-db-sync" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.928029 5008 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9ca61fbb-1714-4b6c-aca8-547813e7d581" containerName="keystone-db-sync" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.928178 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ca61fbb-1714-4b6c-aca8-547813e7d581" containerName="keystone-db-sync" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.928193 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f1fbe4-9cb2-41cd-938b-040adc62c26f" containerName="dnsmasq-dns" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.928678 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.938820 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.939021 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.939107 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-87mjc" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.939215 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.939319 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.962525 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ff6mq"] Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.985298 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c6444c99c-4j7mc"] Mar 18 18:23:12 crc kubenswrapper[5008]: I0318 18:23:12.994083 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.000877 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c6444c99c-4j7mc"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.093293 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-dns-swift-storage-0\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.093642 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-scripts\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.093687 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flh8q\" (UniqueName: \"kubernetes.io/projected/dff13444-29d1-40b4-a711-8d263fb64743-kube-api-access-flh8q\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.093707 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-dns-svc\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.093731 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-credential-keys\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.093752 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-combined-ca-bundle\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.093769 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-ovsdbserver-nb\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.093791 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-fernet-keys\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.093805 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-config\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.093822 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2br8s\" (UniqueName: \"kubernetes.io/projected/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-kube-api-access-2br8s\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.093841 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-ovsdbserver-sb\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.093887 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-config-data\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.163964 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.168590 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.171947 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.172201 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.186720 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.195837 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-dns-svc\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.196079 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-credential-keys\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.196158 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-combined-ca-bundle\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.196220 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-ovsdbserver-nb\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: 
\"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.196297 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-fernet-keys\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.196361 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2br8s\" (UniqueName: \"kubernetes.io/projected/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-kube-api-access-2br8s\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.196420 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-config\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.196486 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-ovsdbserver-sb\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.196600 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-config-data\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 
18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.196697 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-dns-swift-storage-0\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.196785 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-scripts\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.196871 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flh8q\" (UniqueName: \"kubernetes.io/projected/dff13444-29d1-40b4-a711-8d263fb64743-kube-api-access-flh8q\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.198342 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-dns-svc\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.199339 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-config\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.201370 5008 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-dns-swift-storage-0\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.201661 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-ovsdbserver-nb\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.202170 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-ovsdbserver-sb\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.206143 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-jbx2h"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.207190 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.215963 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-config-data\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.219239 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-scripts\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.241377 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-fernet-keys\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.241728 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-credential-keys\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.241975 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-combined-ca-bundle\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.242435 5008 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-k8qml" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.242635 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.242705 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.250395 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flh8q\" (UniqueName: \"kubernetes.io/projected/dff13444-29d1-40b4-a711-8d263fb64743-kube-api-access-flh8q\") pod \"dnsmasq-dns-c6444c99c-4j7mc\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.267176 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2br8s\" (UniqueName: \"kubernetes.io/projected/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-kube-api-access-2br8s\") pod \"keystone-bootstrap-ff6mq\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.297858 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-combined-ca-bundle\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.297911 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a81efd6f-d370-4c91-9343-75f6e6d1e85d-log-httpd\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.297942 
5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-etc-machine-id\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.297984 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-config-data\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.298002 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.298019 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl2gj\" (UniqueName: \"kubernetes.io/projected/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-kube-api-access-zl2gj\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.298061 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-scripts\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.298092 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-scripts\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.298105 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.298158 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-db-sync-config-data\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.298173 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a81efd6f-d370-4c91-9343-75f6e6d1e85d-run-httpd\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.298193 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpqj8\" (UniqueName: \"kubernetes.io/projected/a81efd6f-d370-4c91-9343-75f6e6d1e85d-kube-api-access-bpqj8\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.298210 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-config-data\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.323240 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.336046 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jbx2h"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.359641 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8x6nb"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.362221 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.372894 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.373117 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.373229 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dfntc" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.391030 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-887vw"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.392102 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-887vw" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.394074 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cdq7v" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.394227 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.399367 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407251 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-config-data\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407303 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407329 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl2gj\" (UniqueName: \"kubernetes.io/projected/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-kube-api-access-zl2gj\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407370 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-scripts\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " 
pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407408 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-scripts\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407425 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407510 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-db-sync-config-data\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407532 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a81efd6f-d370-4c91-9343-75f6e6d1e85d-run-httpd\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407575 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpqj8\" (UniqueName: \"kubernetes.io/projected/a81efd6f-d370-4c91-9343-75f6e6d1e85d-kube-api-access-bpqj8\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407597 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-config-data\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407626 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-combined-ca-bundle\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407652 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a81efd6f-d370-4c91-9343-75f6e6d1e85d-log-httpd\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407678 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-etc-machine-id\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.407775 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-etc-machine-id\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.429008 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a81efd6f-d370-4c91-9343-75f6e6d1e85d-log-httpd\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 
18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.429455 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a81efd6f-d370-4c91-9343-75f6e6d1e85d-run-httpd\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.432382 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-scripts\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.433122 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-config-data\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.434405 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-config-data\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.434534 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.435088 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-combined-ca-bundle\") pod 
\"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.435144 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-db-sync-config-data\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.444491 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8x6nb"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.445298 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-scripts\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.448003 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.460262 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl2gj\" (UniqueName: \"kubernetes.io/projected/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-kube-api-access-zl2gj\") pod \"cinder-db-sync-jbx2h\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.475844 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpqj8\" (UniqueName: \"kubernetes.io/projected/a81efd6f-d370-4c91-9343-75f6e6d1e85d-kube-api-access-bpqj8\") pod \"ceilometer-0\" (UID: 
\"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.483591 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-887vw"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.486694 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.502628 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c6444c99c-4j7mc"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.519973 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/906dac4b-4209-4b3b-b934-6804508c028b-logs\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.520416 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7ntd\" (UniqueName: \"kubernetes.io/projected/906dac4b-4209-4b3b-b934-6804508c028b-kube-api-access-d7ntd\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.520446 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-config\") pod \"neutron-db-sync-887vw\" (UID: \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\") " pod="openstack/neutron-db-sync-887vw" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.520472 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-combined-ca-bundle\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.520513 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-config-data\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.520595 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8bns\" (UniqueName: \"kubernetes.io/projected/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-kube-api-access-n8bns\") pod \"neutron-db-sync-887vw\" (UID: \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\") " pod="openstack/neutron-db-sync-887vw" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.520631 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-combined-ca-bundle\") pod \"neutron-db-sync-887vw\" (UID: \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\") " pod="openstack/neutron-db-sync-887vw" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.520753 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-scripts\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.547931 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.559403 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7fb48775-xdlt2"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.563280 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.605048 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7fb48775-xdlt2"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.619922 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-jgqxf"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.621372 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jgqxf"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.621502 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jgqxf" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.623574 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-config-data\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.623627 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8bns\" (UniqueName: \"kubernetes.io/projected/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-kube-api-access-n8bns\") pod \"neutron-db-sync-887vw\" (UID: \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\") " pod="openstack/neutron-db-sync-887vw" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.623650 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-combined-ca-bundle\") pod \"neutron-db-sync-887vw\" (UID: \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\") " pod="openstack/neutron-db-sync-887vw" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.623726 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-scripts\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.623751 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/906dac4b-4209-4b3b-b934-6804508c028b-logs\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.623770 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7ntd\" (UniqueName: \"kubernetes.io/projected/906dac4b-4209-4b3b-b934-6804508c028b-kube-api-access-d7ntd\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.623787 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-config\") pod \"neutron-db-sync-887vw\" (UID: \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\") " pod="openstack/neutron-db-sync-887vw" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.623808 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-combined-ca-bundle\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.627492 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.627728 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4xdkr" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.629482 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-combined-ca-bundle\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.631360 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-scripts\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.633274 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-combined-ca-bundle\") pod \"neutron-db-sync-887vw\" (UID: \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\") " pod="openstack/neutron-db-sync-887vw" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.633430 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/906dac4b-4209-4b3b-b934-6804508c028b-logs\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.641640 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-config-data\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.643609 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" podUID="336cb77a-c6b8-4e03-851e-42e3cfff7ea6" containerName="dnsmasq-dns" containerID="cri-o://7603b0d9152ec0a915f053943f5c72d723c25fb6a1748180a86d1098b601155e" gracePeriod=10 Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.649603 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-config\") pod \"neutron-db-sync-887vw\" (UID: \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\") " pod="openstack/neutron-db-sync-887vw" Mar 18 
18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.651186 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.655489 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8bns\" (UniqueName: \"kubernetes.io/projected/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-kube-api-access-n8bns\") pod \"neutron-db-sync-887vw\" (UID: \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\") " pod="openstack/neutron-db-sync-887vw" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.656929 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7ntd\" (UniqueName: \"kubernetes.io/projected/906dac4b-4209-4b3b-b934-6804508c028b-kube-api-access-d7ntd\") pod \"placement-db-sync-8x6nb\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.726129 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-config\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.726184 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ql95\" (UniqueName: \"kubernetes.io/projected/315d5f4a-5139-47d4-8aaf-c3088d6eae91-kube-api-access-9ql95\") pod \"barbican-db-sync-jgqxf\" (UID: \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\") " pod="openstack/barbican-db-sync-jgqxf" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.726206 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/315d5f4a-5139-47d4-8aaf-c3088d6eae91-db-sync-config-data\") pod \"barbican-db-sync-jgqxf\" (UID: \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\") " pod="openstack/barbican-db-sync-jgqxf" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.726280 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjzlg\" (UniqueName: \"kubernetes.io/projected/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-kube-api-access-jjzlg\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.726307 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-ovsdbserver-nb\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.726327 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-dns-svc\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.726352 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-ovsdbserver-sb\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.726392 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-dns-swift-storage-0\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.726482 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315d5f4a-5139-47d4-8aaf-c3088d6eae91-combined-ca-bundle\") pod \"barbican-db-sync-jgqxf\" (UID: \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\") " pod="openstack/barbican-db-sync-jgqxf" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.790812 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.802422 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-887vw" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.828044 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjzlg\" (UniqueName: \"kubernetes.io/projected/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-kube-api-access-jjzlg\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.828526 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-ovsdbserver-nb\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.828634 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-dns-svc\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.828725 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-ovsdbserver-sb\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.829690 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-dns-swift-storage-0\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.829718 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315d5f4a-5139-47d4-8aaf-c3088d6eae91-combined-ca-bundle\") pod \"barbican-db-sync-jgqxf\" (UID: \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\") " pod="openstack/barbican-db-sync-jgqxf" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.829737 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-ovsdbserver-nb\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.829863 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-config\") 
pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.829941 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ql95\" (UniqueName: \"kubernetes.io/projected/315d5f4a-5139-47d4-8aaf-c3088d6eae91-kube-api-access-9ql95\") pod \"barbican-db-sync-jgqxf\" (UID: \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\") " pod="openstack/barbican-db-sync-jgqxf" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.829979 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/315d5f4a-5139-47d4-8aaf-c3088d6eae91-db-sync-config-data\") pod \"barbican-db-sync-jgqxf\" (UID: \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\") " pod="openstack/barbican-db-sync-jgqxf" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.831456 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-dns-svc\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.834287 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315d5f4a-5139-47d4-8aaf-c3088d6eae91-combined-ca-bundle\") pod \"barbican-db-sync-jgqxf\" (UID: \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\") " pod="openstack/barbican-db-sync-jgqxf" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.836720 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/315d5f4a-5139-47d4-8aaf-c3088d6eae91-db-sync-config-data\") pod \"barbican-db-sync-jgqxf\" (UID: \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\") " 
pod="openstack/barbican-db-sync-jgqxf" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.838776 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-config\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.839413 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-dns-swift-storage-0\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.839505 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-ovsdbserver-sb\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.853468 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjzlg\" (UniqueName: \"kubernetes.io/projected/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-kube-api-access-jjzlg\") pod \"dnsmasq-dns-5d7fb48775-xdlt2\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.861420 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ql95\" (UniqueName: \"kubernetes.io/projected/315d5f4a-5139-47d4-8aaf-c3088d6eae91-kube-api-access-9ql95\") pod \"barbican-db-sync-jgqxf\" (UID: \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\") " pod="openstack/barbican-db-sync-jgqxf" Mar 18 18:23:13 crc kubenswrapper[5008]: 
I0318 18:23:13.903593 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.911225 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c6444c99c-4j7mc"] Mar 18 18:23:13 crc kubenswrapper[5008]: I0318 18:23:13.951591 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jgqxf" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.086253 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.093434 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.102366 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.103100 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-nh2zd" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.103313 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.103417 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.106287 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.113391 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.157726 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ff6mq"] Mar 18 
18:23:14 crc kubenswrapper[5008]: W0318 18:23:14.182843 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4cf72df_2cbd_40ad_a19d_3b3fb02410b8.slice/crio-e81d84bda327dcd6a8ab6899382386f252987b3b4b973192036734b7b25ca411 WatchSource:0}: Error finding container e81d84bda327dcd6a8ab6899382386f252987b3b4b973192036734b7b25ca411: Status 404 returned error can't find the container with id e81d84bda327dcd6a8ab6899382386f252987b3b4b973192036734b7b25ca411 Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.192059 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.193387 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.196965 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.197183 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.208841 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.240041 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-config-data\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.240071 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57qzf\" (UniqueName: \"kubernetes.io/projected/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-kube-api-access-57qzf\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.240094 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-logs\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.240117 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.240139 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.240165 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-scripts\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.240209 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.240263 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.323697 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.341276 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jbx2h"] Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.341996 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-ovsdbserver-nb\") pod \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " Mar 18 18:23:14 crc 
kubenswrapper[5008]: I0318 18:23:14.342117 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-dns-swift-storage-0\") pod \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342165 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5vq6\" (UniqueName: \"kubernetes.io/projected/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-kube-api-access-v5vq6\") pod \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342197 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-dns-svc\") pod \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342227 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-ovsdbserver-sb\") pod \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342297 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-config\") pod \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\" (UID: \"336cb77a-c6b8-4e03-851e-42e3cfff7ea6\") " Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342491 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342533 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342615 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dc36d57-9844-4986-946c-f3c5fd678ecd-logs\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342638 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-config-data\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342661 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57qzf\" (UniqueName: \"kubernetes.io/projected/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-kube-api-access-57qzf\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342684 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-logs\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342721 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342743 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342762 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342789 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-scripts\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342806 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342829 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342850 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342874 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dc36d57-9844-4986-946c-f3c5fd678ecd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342893 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btrmn\" (UniqueName: \"kubernetes.io/projected/8dc36d57-9844-4986-946c-f3c5fd678ecd-kube-api-access-btrmn\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.342913 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.345844 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-logs\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.347442 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.348025 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.353661 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.354513 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.354909 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-config-data\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.369845 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.370042 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-kube-api-access-v5vq6" (OuterVolumeSpecName: "kube-api-access-v5vq6") pod "336cb77a-c6b8-4e03-851e-42e3cfff7ea6" (UID: "336cb77a-c6b8-4e03-851e-42e3cfff7ea6"). InnerVolumeSpecName "kube-api-access-v5vq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.401606 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57qzf\" (UniqueName: \"kubernetes.io/projected/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-kube-api-access-57qzf\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.427717 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "336cb77a-c6b8-4e03-851e-42e3cfff7ea6" (UID: "336cb77a-c6b8-4e03-851e-42e3cfff7ea6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.444644 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.444697 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.444733 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") 
" pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.444756 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dc36d57-9844-4986-946c-f3c5fd678ecd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.444777 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btrmn\" (UniqueName: \"kubernetes.io/projected/8dc36d57-9844-4986-946c-f3c5fd678ecd-kube-api-access-btrmn\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.444793 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.444823 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.444879 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dc36d57-9844-4986-946c-f3c5fd678ecd-logs\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 
crc kubenswrapper[5008]: I0318 18:23:14.444940 5008 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.444952 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5vq6\" (UniqueName: \"kubernetes.io/projected/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-kube-api-access-v5vq6\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.445416 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dc36d57-9844-4986-946c-f3c5fd678ecd-logs\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.445779 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dc36d57-9844-4986-946c-f3c5fd678ecd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.448210 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.448377 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") device mount 
path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.462802 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-config" (OuterVolumeSpecName: "config") pod "336cb77a-c6b8-4e03-851e-42e3cfff7ea6" (UID: "336cb77a-c6b8-4e03-851e-42e3cfff7ea6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.464938 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.465535 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.466476 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.487796 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 
crc kubenswrapper[5008]: I0318 18:23:14.493191 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btrmn\" (UniqueName: \"kubernetes.io/projected/8dc36d57-9844-4986-946c-f3c5fd678ecd-kube-api-access-btrmn\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.495437 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "336cb77a-c6b8-4e03-851e-42e3cfff7ea6" (UID: "336cb77a-c6b8-4e03-851e-42e3cfff7ea6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.499470 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8x6nb"] Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.502908 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "336cb77a-c6b8-4e03-851e-42e3cfff7ea6" (UID: "336cb77a-c6b8-4e03-851e-42e3cfff7ea6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.516157 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "336cb77a-c6b8-4e03-851e-42e3cfff7ea6" (UID: "336cb77a-c6b8-4e03-851e-42e3cfff7ea6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.520516 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-887vw"] Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.529578 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.547892 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.548017 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.548035 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.548045 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/336cb77a-c6b8-4e03-851e-42e3cfff7ea6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:14 crc kubenswrapper[5008]: W0318 18:23:14.560084 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05c9a5fb_e4e1_4d69_b790_2f5890b62aa8.slice/crio-52b349ebc11089b61d282a1a50273e0220204eae5672408667d0702801193b82 WatchSource:0}: Error finding container 
52b349ebc11089b61d282a1a50273e0220204eae5672408667d0702801193b82: Status 404 returned error can't find the container with id 52b349ebc11089b61d282a1a50273e0220204eae5672408667d0702801193b82 Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.612144 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jgqxf"] Mar 18 18:23:14 crc kubenswrapper[5008]: W0318 18:23:14.629073 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod315d5f4a_5139_47d4_8aaf_c3088d6eae91.slice/crio-978cd20e9a75a20d10cef9778acfffef3686b1fb723fb13e63b33c995f3c6272 WatchSource:0}: Error finding container 978cd20e9a75a20d10cef9778acfffef3686b1fb723fb13e63b33c995f3c6272: Status 404 returned error can't find the container with id 978cd20e9a75a20d10cef9778acfffef3686b1fb723fb13e63b33c995f3c6272 Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.636768 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.653988 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jbx2h" event={"ID":"bed54cd2-a411-4362-a7b1-7fab16ba8b6b","Type":"ContainerStarted","Data":"a82bf8d71775861ab06ef06297311574a5993f9c7bfe1583586d2e6b7d097e89"} Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.655424 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jgqxf" event={"ID":"315d5f4a-5139-47d4-8aaf-c3088d6eae91","Type":"ContainerStarted","Data":"978cd20e9a75a20d10cef9778acfffef3686b1fb723fb13e63b33c995f3c6272"} Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.656573 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a81efd6f-d370-4c91-9343-75f6e6d1e85d","Type":"ContainerStarted","Data":"f520d91ca33efbade783543f5a1157f1b6b9c16207e72d2804145ad3fed49481"} Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.658241 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ff6mq" event={"ID":"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8","Type":"ContainerStarted","Data":"e81d84bda327dcd6a8ab6899382386f252987b3b4b973192036734b7b25ca411"} Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.660171 5008 generic.go:334] "Generic (PLEG): container finished" podID="dff13444-29d1-40b4-a711-8d263fb64743" containerID="d818a606e214913bfce864cd642dda4de227dda70516a4c01ec5c72c860f63d7" exitCode=0 Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.660217 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" event={"ID":"dff13444-29d1-40b4-a711-8d263fb64743","Type":"ContainerDied","Data":"d818a606e214913bfce864cd642dda4de227dda70516a4c01ec5c72c860f63d7"} Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.660241 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" event={"ID":"dff13444-29d1-40b4-a711-8d263fb64743","Type":"ContainerStarted","Data":"fa6d897e49444d21b8f703009e3a6d1078aaaa23a5e4d5f44b545537476fc781"} Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.663424 5008 generic.go:334] "Generic (PLEG): container finished" podID="336cb77a-c6b8-4e03-851e-42e3cfff7ea6" containerID="7603b0d9152ec0a915f053943f5c72d723c25fb6a1748180a86d1098b601155e" exitCode=0 Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.663716 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.665516 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" event={"ID":"336cb77a-c6b8-4e03-851e-42e3cfff7ea6","Type":"ContainerDied","Data":"7603b0d9152ec0a915f053943f5c72d723c25fb6a1748180a86d1098b601155e"} Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.665645 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4f86d997-dknkr" event={"ID":"336cb77a-c6b8-4e03-851e-42e3cfff7ea6","Type":"ContainerDied","Data":"fa33a3ad789e1819ca6d16a24f28aaef1535212987cd9e899e4ead5a3cd6ed4a"} Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.665701 5008 scope.go:117] "RemoveContainer" containerID="7603b0d9152ec0a915f053943f5c72d723c25fb6a1748180a86d1098b601155e" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.682922 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8x6nb" event={"ID":"906dac4b-4209-4b3b-b934-6804508c028b","Type":"ContainerStarted","Data":"47177af81029ecb448cd55ec70712c6959f33aa763b54c1c187d88d8faac0aea"} Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.698949 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-887vw" 
event={"ID":"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8","Type":"ContainerStarted","Data":"52b349ebc11089b61d282a1a50273e0220204eae5672408667d0702801193b82"} Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.713822 5008 scope.go:117] "RemoveContainer" containerID="ffb328bd24ce1a8076680623337bc7b4269c3b5241b284e26145bf919d8bc4e0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.716665 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7fb48775-xdlt2"] Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.727643 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d4f86d997-dknkr"] Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.736141 5008 scope.go:117] "RemoveContainer" containerID="7603b0d9152ec0a915f053943f5c72d723c25fb6a1748180a86d1098b601155e" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.737103 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:23:14 crc kubenswrapper[5008]: E0318 18:23:14.739669 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7603b0d9152ec0a915f053943f5c72d723c25fb6a1748180a86d1098b601155e\": container with ID starting with 7603b0d9152ec0a915f053943f5c72d723c25fb6a1748180a86d1098b601155e not found: ID does not exist" containerID="7603b0d9152ec0a915f053943f5c72d723c25fb6a1748180a86d1098b601155e" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.739702 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7603b0d9152ec0a915f053943f5c72d723c25fb6a1748180a86d1098b601155e"} err="failed to get container status \"7603b0d9152ec0a915f053943f5c72d723c25fb6a1748180a86d1098b601155e\": rpc error: code = NotFound desc = could not find container \"7603b0d9152ec0a915f053943f5c72d723c25fb6a1748180a86d1098b601155e\": container with ID starting with 
7603b0d9152ec0a915f053943f5c72d723c25fb6a1748180a86d1098b601155e not found: ID does not exist" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.739725 5008 scope.go:117] "RemoveContainer" containerID="ffb328bd24ce1a8076680623337bc7b4269c3b5241b284e26145bf919d8bc4e0" Mar 18 18:23:14 crc kubenswrapper[5008]: E0318 18:23:14.739951 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffb328bd24ce1a8076680623337bc7b4269c3b5241b284e26145bf919d8bc4e0\": container with ID starting with ffb328bd24ce1a8076680623337bc7b4269c3b5241b284e26145bf919d8bc4e0 not found: ID does not exist" containerID="ffb328bd24ce1a8076680623337bc7b4269c3b5241b284e26145bf919d8bc4e0" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.739966 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb328bd24ce1a8076680623337bc7b4269c3b5241b284e26145bf919d8bc4e0"} err="failed to get container status \"ffb328bd24ce1a8076680623337bc7b4269c3b5241b284e26145bf919d8bc4e0\": rpc error: code = NotFound desc = could not find container \"ffb328bd24ce1a8076680623337bc7b4269c3b5241b284e26145bf919d8bc4e0\": container with ID starting with ffb328bd24ce1a8076680623337bc7b4269c3b5241b284e26145bf919d8bc4e0 not found: ID does not exist" Mar 18 18:23:14 crc kubenswrapper[5008]: I0318 18:23:14.751218 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d4f86d997-dknkr"] Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.040608 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.166457 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-config\") pod \"dff13444-29d1-40b4-a711-8d263fb64743\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.166525 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-ovsdbserver-sb\") pod \"dff13444-29d1-40b4-a711-8d263fb64743\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.166577 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flh8q\" (UniqueName: \"kubernetes.io/projected/dff13444-29d1-40b4-a711-8d263fb64743-kube-api-access-flh8q\") pod \"dff13444-29d1-40b4-a711-8d263fb64743\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.166631 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-ovsdbserver-nb\") pod \"dff13444-29d1-40b4-a711-8d263fb64743\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.166655 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-dns-svc\") pod \"dff13444-29d1-40b4-a711-8d263fb64743\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.166674 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-dns-swift-storage-0\") pod \"dff13444-29d1-40b4-a711-8d263fb64743\" (UID: \"dff13444-29d1-40b4-a711-8d263fb64743\") " Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.171534 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff13444-29d1-40b4-a711-8d263fb64743-kube-api-access-flh8q" (OuterVolumeSpecName: "kube-api-access-flh8q") pod "dff13444-29d1-40b4-a711-8d263fb64743" (UID: "dff13444-29d1-40b4-a711-8d263fb64743"). InnerVolumeSpecName "kube-api-access-flh8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.214824 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dff13444-29d1-40b4-a711-8d263fb64743" (UID: "dff13444-29d1-40b4-a711-8d263fb64743"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.268771 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flh8q\" (UniqueName: \"kubernetes.io/projected/dff13444-29d1-40b4-a711-8d263fb64743-kube-api-access-flh8q\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.269105 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.270088 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dff13444-29d1-40b4-a711-8d263fb64743" (UID: "dff13444-29d1-40b4-a711-8d263fb64743"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.278623 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dff13444-29d1-40b4-a711-8d263fb64743" (UID: "dff13444-29d1-40b4-a711-8d263fb64743"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.315920 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-config" (OuterVolumeSpecName: "config") pod "dff13444-29d1-40b4-a711-8d263fb64743" (UID: "dff13444-29d1-40b4-a711-8d263fb64743"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.319073 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dff13444-29d1-40b4-a711-8d263fb64743" (UID: "dff13444-29d1-40b4-a711-8d263fb64743"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.354909 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.370279 5008 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.370439 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.370496 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.370547 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dff13444-29d1-40b4-a711-8d263fb64743-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.547426 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:23:15 crc kubenswrapper[5008]: W0318 18:23:15.564771 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8df7f5ad_81e9_4e27_b9d9_d6ad3959b610.slice/crio-ee1315d562e5e75a87fc5e1d84c387010485a9a28b689562a76d409eefa86c02 WatchSource:0}: Error finding container ee1315d562e5e75a87fc5e1d84c387010485a9a28b689562a76d409eefa86c02: Status 404 returned error can't find the container with id ee1315d562e5e75a87fc5e1d84c387010485a9a28b689562a76d409eefa86c02 Mar 18 
18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.702789 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.777183 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.781503 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" event={"ID":"dff13444-29d1-40b4-a711-8d263fb64743","Type":"ContainerDied","Data":"fa6d897e49444d21b8f703009e3a6d1078aaaa23a5e4d5f44b545537476fc781"} Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.781645 5008 scope.go:117] "RemoveContainer" containerID="d818a606e214913bfce864cd642dda4de227dda70516a4c01ec5c72c860f63d7" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.781730 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c6444c99c-4j7mc" Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.786629 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.807877 5008 generic.go:334] "Generic (PLEG): container finished" podID="f66df63f-e2b6-4f43-b3c0-9ebb157b0693" containerID="80d2a947f294713c07eea3219b9344b3577bc4f8697c02a1a57e93289ca68073" exitCode=0 Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.807938 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" event={"ID":"f66df63f-e2b6-4f43-b3c0-9ebb157b0693","Type":"ContainerDied","Data":"80d2a947f294713c07eea3219b9344b3577bc4f8697c02a1a57e93289ca68073"} Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.807962 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" 
event={"ID":"f66df63f-e2b6-4f43-b3c0-9ebb157b0693","Type":"ContainerStarted","Data":"440e155fcc78b52bc476f90e706a59717c932602bb9edc57eaf75db097115d05"} Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.813782 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8dc36d57-9844-4986-946c-f3c5fd678ecd","Type":"ContainerStarted","Data":"6036c7a45277a503b50fbcc89e6f0b223e6a3b2a9ebb923a5a69a2a693a3bfa5"} Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.827383 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-887vw" event={"ID":"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8","Type":"ContainerStarted","Data":"583ea8f91c610b42793eda14286b85f2452817e74199e6c8c7a0515bdd26747e"} Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.830996 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610","Type":"ContainerStarted","Data":"ee1315d562e5e75a87fc5e1d84c387010485a9a28b689562a76d409eefa86c02"} Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.845392 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ff6mq" event={"ID":"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8","Type":"ContainerStarted","Data":"081f3bead1fb53eb0fef72bd4a02d8ed9f8351a7fc981808236fa647e89e8a7d"} Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.875631 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-887vw" podStartSLOduration=2.875606704 podStartE2EDuration="2.875606704s" podCreationTimestamp="2026-03-18 18:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:15.85381628 +0000 UTC m=+1252.373289359" watchObservedRunningTime="2026-03-18 18:23:15.875606704 +0000 UTC m=+1252.395079793" Mar 18 18:23:15 crc 
kubenswrapper[5008]: I0318 18:23:15.942246 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c6444c99c-4j7mc"] Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.952319 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c6444c99c-4j7mc"] Mar 18 18:23:15 crc kubenswrapper[5008]: I0318 18:23:15.952378 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ff6mq" podStartSLOduration=3.952358822 podStartE2EDuration="3.952358822s" podCreationTimestamp="2026-03-18 18:23:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:15.935996383 +0000 UTC m=+1252.455469462" watchObservedRunningTime="2026-03-18 18:23:15.952358822 +0000 UTC m=+1252.471831901" Mar 18 18:23:16 crc kubenswrapper[5008]: I0318 18:23:16.238056 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="336cb77a-c6b8-4e03-851e-42e3cfff7ea6" path="/var/lib/kubelet/pods/336cb77a-c6b8-4e03-851e-42e3cfff7ea6/volumes" Mar 18 18:23:16 crc kubenswrapper[5008]: I0318 18:23:16.240291 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff13444-29d1-40b4-a711-8d263fb64743" path="/var/lib/kubelet/pods/dff13444-29d1-40b4-a711-8d263fb64743/volumes" Mar 18 18:23:16 crc kubenswrapper[5008]: I0318 18:23:16.871066 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610","Type":"ContainerStarted","Data":"3c7b30d562f8fe920a7718a5d1e21d95fd6473a6216529c345d0522f2eaba9b2"} Mar 18 18:23:16 crc kubenswrapper[5008]: I0318 18:23:16.881943 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" 
event={"ID":"f66df63f-e2b6-4f43-b3c0-9ebb157b0693","Type":"ContainerStarted","Data":"43cfa593601e48407235506732eba9f5732b9c13fe7595b50879ee082d29c276"} Mar 18 18:23:16 crc kubenswrapper[5008]: I0318 18:23:16.882949 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:16 crc kubenswrapper[5008]: I0318 18:23:16.885476 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8dc36d57-9844-4986-946c-f3c5fd678ecd","Type":"ContainerStarted","Data":"4bda19fda517832cd7ef7bacb245df37d2fc34d0da8ce49c4ae7c20ef849bfc6"} Mar 18 18:23:16 crc kubenswrapper[5008]: I0318 18:23:16.906235 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" podStartSLOduration=3.906149834 podStartE2EDuration="3.906149834s" podCreationTimestamp="2026-03-18 18:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:16.905767284 +0000 UTC m=+1253.425240363" watchObservedRunningTime="2026-03-18 18:23:16.906149834 +0000 UTC m=+1253.425622913" Mar 18 18:23:17 crc kubenswrapper[5008]: I0318 18:23:17.897313 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8dc36d57-9844-4986-946c-f3c5fd678ecd","Type":"ContainerStarted","Data":"eeacb404fbc0d0dec44c79173d40381e16d6d31b61813b097d343131cc9bb34f"} Mar 18 18:23:17 crc kubenswrapper[5008]: I0318 18:23:17.897429 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8dc36d57-9844-4986-946c-f3c5fd678ecd" containerName="glance-log" containerID="cri-o://4bda19fda517832cd7ef7bacb245df37d2fc34d0da8ce49c4ae7c20ef849bfc6" gracePeriod=30 Mar 18 18:23:17 crc kubenswrapper[5008]: I0318 18:23:17.897488 5008 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8dc36d57-9844-4986-946c-f3c5fd678ecd" containerName="glance-httpd" containerID="cri-o://eeacb404fbc0d0dec44c79173d40381e16d6d31b61813b097d343131cc9bb34f" gracePeriod=30 Mar 18 18:23:17 crc kubenswrapper[5008]: I0318 18:23:17.904718 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610","Type":"ContainerStarted","Data":"0630c31033dabebeadb5c81a689ec8d9acd96ecf276eb8f3cc86b0941d0a11fd"} Mar 18 18:23:17 crc kubenswrapper[5008]: I0318 18:23:17.904958 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" containerName="glance-httpd" containerID="cri-o://0630c31033dabebeadb5c81a689ec8d9acd96ecf276eb8f3cc86b0941d0a11fd" gracePeriod=30 Mar 18 18:23:17 crc kubenswrapper[5008]: I0318 18:23:17.904965 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" containerName="glance-log" containerID="cri-o://3c7b30d562f8fe920a7718a5d1e21d95fd6473a6216529c345d0522f2eaba9b2" gracePeriod=30 Mar 18 18:23:17 crc kubenswrapper[5008]: I0318 18:23:17.936344 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.936322973 podStartE2EDuration="4.936322973s" podCreationTimestamp="2026-03-18 18:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:17.930785495 +0000 UTC m=+1254.450258604" watchObservedRunningTime="2026-03-18 18:23:17.936322973 +0000 UTC m=+1254.455796052" Mar 18 18:23:17 crc kubenswrapper[5008]: I0318 18:23:17.960644 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=4.96062623 podStartE2EDuration="4.96062623s" podCreationTimestamp="2026-03-18 18:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:17.954321513 +0000 UTC m=+1254.473794612" watchObservedRunningTime="2026-03-18 18:23:17.96062623 +0000 UTC m=+1254.480099309" Mar 18 18:23:18 crc kubenswrapper[5008]: I0318 18:23:18.924421 5008 generic.go:334] "Generic (PLEG): container finished" podID="8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" containerID="0630c31033dabebeadb5c81a689ec8d9acd96ecf276eb8f3cc86b0941d0a11fd" exitCode=143 Mar 18 18:23:18 crc kubenswrapper[5008]: I0318 18:23:18.924868 5008 generic.go:334] "Generic (PLEG): container finished" podID="8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" containerID="3c7b30d562f8fe920a7718a5d1e21d95fd6473a6216529c345d0522f2eaba9b2" exitCode=143 Mar 18 18:23:18 crc kubenswrapper[5008]: I0318 18:23:18.924505 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610","Type":"ContainerDied","Data":"0630c31033dabebeadb5c81a689ec8d9acd96ecf276eb8f3cc86b0941d0a11fd"} Mar 18 18:23:18 crc kubenswrapper[5008]: I0318 18:23:18.924989 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610","Type":"ContainerDied","Data":"3c7b30d562f8fe920a7718a5d1e21d95fd6473a6216529c345d0522f2eaba9b2"} Mar 18 18:23:18 crc kubenswrapper[5008]: I0318 18:23:18.927241 5008 generic.go:334] "Generic (PLEG): container finished" podID="d4cf72df-2cbd-40ad-a19d-3b3fb02410b8" containerID="081f3bead1fb53eb0fef72bd4a02d8ed9f8351a7fc981808236fa647e89e8a7d" exitCode=0 Mar 18 18:23:18 crc kubenswrapper[5008]: I0318 18:23:18.927331 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ff6mq" 
event={"ID":"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8","Type":"ContainerDied","Data":"081f3bead1fb53eb0fef72bd4a02d8ed9f8351a7fc981808236fa647e89e8a7d"} Mar 18 18:23:18 crc kubenswrapper[5008]: I0318 18:23:18.931871 5008 generic.go:334] "Generic (PLEG): container finished" podID="8dc36d57-9844-4986-946c-f3c5fd678ecd" containerID="eeacb404fbc0d0dec44c79173d40381e16d6d31b61813b097d343131cc9bb34f" exitCode=0 Mar 18 18:23:18 crc kubenswrapper[5008]: I0318 18:23:18.931906 5008 generic.go:334] "Generic (PLEG): container finished" podID="8dc36d57-9844-4986-946c-f3c5fd678ecd" containerID="4bda19fda517832cd7ef7bacb245df37d2fc34d0da8ce49c4ae7c20ef849bfc6" exitCode=143 Mar 18 18:23:18 crc kubenswrapper[5008]: I0318 18:23:18.931953 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8dc36d57-9844-4986-946c-f3c5fd678ecd","Type":"ContainerDied","Data":"eeacb404fbc0d0dec44c79173d40381e16d6d31b61813b097d343131cc9bb34f"} Mar 18 18:23:18 crc kubenswrapper[5008]: I0318 18:23:18.931997 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8dc36d57-9844-4986-946c-f3c5fd678ecd","Type":"ContainerDied","Data":"4bda19fda517832cd7ef7bacb245df37d2fc34d0da8ce49c4ae7c20ef849bfc6"} Mar 18 18:23:23 crc kubenswrapper[5008]: I0318 18:23:23.906747 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:24 crc kubenswrapper[5008]: I0318 18:23:24.014515 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-8xdxz"] Mar 18 18:23:24 crc kubenswrapper[5008]: I0318 18:23:24.014806 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" podUID="9b085aa0-d1ca-47a4-9b12-588dc9be67fe" containerName="dnsmasq-dns" containerID="cri-o://16f326ba4481d5794a3cf862529549c6f7b6bc1c7a74670132db25e791e7ddc5" gracePeriod=10 Mar 18 
18:23:24 crc kubenswrapper[5008]: I0318 18:23:24.997004 5008 generic.go:334] "Generic (PLEG): container finished" podID="9b085aa0-d1ca-47a4-9b12-588dc9be67fe" containerID="16f326ba4481d5794a3cf862529549c6f7b6bc1c7a74670132db25e791e7ddc5" exitCode=0 Mar 18 18:23:24 crc kubenswrapper[5008]: I0318 18:23:24.997086 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" event={"ID":"9b085aa0-d1ca-47a4-9b12-588dc9be67fe","Type":"ContainerDied","Data":"16f326ba4481d5794a3cf862529549c6f7b6bc1c7a74670132db25e791e7ddc5"} Mar 18 18:23:25 crc kubenswrapper[5008]: I0318 18:23:25.992807 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.002449 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.009923 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610","Type":"ContainerDied","Data":"ee1315d562e5e75a87fc5e1d84c387010485a9a28b689562a76d409eefa86c02"} Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.010314 5008 scope.go:117] "RemoveContainer" containerID="0630c31033dabebeadb5c81a689ec8d9acd96ecf276eb8f3cc86b0941d0a11fd" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.010851 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.039745 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ff6mq" event={"ID":"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8","Type":"ContainerDied","Data":"e81d84bda327dcd6a8ab6899382386f252987b3b4b973192036734b7b25ca411"} Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.039792 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e81d84bda327dcd6a8ab6899382386f252987b3b4b973192036734b7b25ca411" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.039850 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ff6mq" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.112305 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-httpd-run\") pod \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.112370 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2br8s\" (UniqueName: \"kubernetes.io/projected/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-kube-api-access-2br8s\") pod \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.112398 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.112475 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-combined-ca-bundle\") pod \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.112513 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-scripts\") pod \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.112543 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-public-tls-certs\") pod \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.113143 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" (UID: "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.113525 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57qzf\" (UniqueName: \"kubernetes.io/projected/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-kube-api-access-57qzf\") pod \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.113577 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-fernet-keys\") pod \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.113622 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-config-data\") pod \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.113638 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-combined-ca-bundle\") pod \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.113684 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-scripts\") pod \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.113722 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-credential-keys\") pod \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\" (UID: \"d4cf72df-2cbd-40ad-a19d-3b3fb02410b8\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.113756 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-logs\") pod \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.113786 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-config-data\") pod \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\" (UID: \"8df7f5ad-81e9-4e27-b9d9-d6ad3959b610\") " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.114263 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.119295 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-kube-api-access-57qzf" (OuterVolumeSpecName: "kube-api-access-57qzf") pod "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" (UID: "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610"). InnerVolumeSpecName "kube-api-access-57qzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.119549 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" podUID="9b085aa0-d1ca-47a4-9b12-588dc9be67fe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.120136 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-logs" (OuterVolumeSpecName: "logs") pod "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" (UID: "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.125282 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d4cf72df-2cbd-40ad-a19d-3b3fb02410b8" (UID: "d4cf72df-2cbd-40ad-a19d-3b3fb02410b8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.126219 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d4cf72df-2cbd-40ad-a19d-3b3fb02410b8" (UID: "d4cf72df-2cbd-40ad-a19d-3b3fb02410b8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.126943 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-scripts" (OuterVolumeSpecName: "scripts") pod "d4cf72df-2cbd-40ad-a19d-3b3fb02410b8" (UID: "d4cf72df-2cbd-40ad-a19d-3b3fb02410b8"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.128863 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-kube-api-access-2br8s" (OuterVolumeSpecName: "kube-api-access-2br8s") pod "d4cf72df-2cbd-40ad-a19d-3b3fb02410b8" (UID: "d4cf72df-2cbd-40ad-a19d-3b3fb02410b8"). InnerVolumeSpecName "kube-api-access-2br8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.134853 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" (UID: "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.155514 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-scripts" (OuterVolumeSpecName: "scripts") pod "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" (UID: "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.160786 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" (UID: "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.169953 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4cf72df-2cbd-40ad-a19d-3b3fb02410b8" (UID: "d4cf72df-2cbd-40ad-a19d-3b3fb02410b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.174504 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-config-data" (OuterVolumeSpecName: "config-data") pod "d4cf72df-2cbd-40ad-a19d-3b3fb02410b8" (UID: "d4cf72df-2cbd-40ad-a19d-3b3fb02410b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.186832 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" (UID: "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.198644 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-config-data" (OuterVolumeSpecName: "config-data") pod "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" (UID: "8df7f5ad-81e9-4e27-b9d9-d6ad3959b610"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.215521 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.215569 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.215583 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.215595 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57qzf\" (UniqueName: \"kubernetes.io/projected/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-kube-api-access-57qzf\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.215607 5008 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.215619 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.215631 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.215641 5008 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.215653 5008 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.215664 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.215674 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.215685 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2br8s\" (UniqueName: \"kubernetes.io/projected/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8-kube-api-access-2br8s\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.215720 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.240054 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.317169 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:26 crc 
kubenswrapper[5008]: I0318 18:23:26.354367 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.361956 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.375908 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:23:26 crc kubenswrapper[5008]: E0318 18:23:26.376243 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" containerName="glance-httpd" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.376255 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" containerName="glance-httpd" Mar 18 18:23:26 crc kubenswrapper[5008]: E0318 18:23:26.376268 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cf72df-2cbd-40ad-a19d-3b3fb02410b8" containerName="keystone-bootstrap" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.376274 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cf72df-2cbd-40ad-a19d-3b3fb02410b8" containerName="keystone-bootstrap" Mar 18 18:23:26 crc kubenswrapper[5008]: E0318 18:23:26.376284 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336cb77a-c6b8-4e03-851e-42e3cfff7ea6" containerName="dnsmasq-dns" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.376290 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="336cb77a-c6b8-4e03-851e-42e3cfff7ea6" containerName="dnsmasq-dns" Mar 18 18:23:26 crc kubenswrapper[5008]: E0318 18:23:26.376303 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff13444-29d1-40b4-a711-8d263fb64743" containerName="init" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.376309 5008 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dff13444-29d1-40b4-a711-8d263fb64743" containerName="init" Mar 18 18:23:26 crc kubenswrapper[5008]: E0318 18:23:26.376317 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" containerName="glance-log" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.376323 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" containerName="glance-log" Mar 18 18:23:26 crc kubenswrapper[5008]: E0318 18:23:26.376331 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336cb77a-c6b8-4e03-851e-42e3cfff7ea6" containerName="init" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.376336 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="336cb77a-c6b8-4e03-851e-42e3cfff7ea6" containerName="init" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.376491 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff13444-29d1-40b4-a711-8d263fb64743" containerName="init" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.376504 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" containerName="glance-httpd" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.376516 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4cf72df-2cbd-40ad-a19d-3b3fb02410b8" containerName="keystone-bootstrap" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.376523 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" containerName="glance-log" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.376531 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="336cb77a-c6b8-4e03-851e-42e3cfff7ea6" containerName="dnsmasq-dns" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.377586 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.379988 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.380371 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.396759 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.520009 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-config-data\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.520075 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pdhs\" (UniqueName: \"kubernetes.io/projected/31a8e3a3-716a-49ec-962e-25580e47f4a6-kube-api-access-7pdhs\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.520100 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.520137 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/31a8e3a3-716a-49ec-962e-25580e47f4a6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.520179 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.520232 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-scripts\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.520252 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31a8e3a3-716a-49ec-962e-25580e47f4a6-logs\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.520268 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.621695 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-scripts\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.621742 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31a8e3a3-716a-49ec-962e-25580e47f4a6-logs\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.621760 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.621810 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-config-data\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.621848 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdhs\" (UniqueName: \"kubernetes.io/projected/31a8e3a3-716a-49ec-962e-25580e47f4a6-kube-api-access-7pdhs\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.621866 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.621902 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31a8e3a3-716a-49ec-962e-25580e47f4a6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.621939 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.623062 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31a8e3a3-716a-49ec-962e-25580e47f4a6-logs\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.623608 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.624264 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31a8e3a3-716a-49ec-962e-25580e47f4a6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " 
pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.626796 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.627284 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-config-data\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.627521 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.627754 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-scripts\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.640871 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pdhs\" (UniqueName: \"kubernetes.io/projected/31a8e3a3-716a-49ec-962e-25580e47f4a6-kube-api-access-7pdhs\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: 
I0318 18:23:26.652470 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " pod="openstack/glance-default-external-api-0" Mar 18 18:23:26 crc kubenswrapper[5008]: I0318 18:23:26.704172 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.122665 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ff6mq"] Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.131836 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ff6mq"] Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.209917 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rsqds"] Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.211147 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.216236 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.216325 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-87mjc" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.216426 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.216482 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.216592 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.227186 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rsqds"] Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.335080 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsnjl\" (UniqueName: \"kubernetes.io/projected/9c1a8234-533f-4cd5-9517-52acab86e99f-kube-api-access-lsnjl\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.335601 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-scripts\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.335859 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-combined-ca-bundle\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.336007 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-config-data\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.336138 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-fernet-keys\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.336328 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-credential-keys\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.437374 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-credential-keys\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.437422 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsnjl\" (UniqueName: 
\"kubernetes.io/projected/9c1a8234-533f-4cd5-9517-52acab86e99f-kube-api-access-lsnjl\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.437478 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-scripts\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.437517 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-combined-ca-bundle\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.437536 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-config-data\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.437580 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-fernet-keys\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.442112 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-credential-keys\") pod \"keystone-bootstrap-rsqds\" (UID: 
\"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.443132 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-config-data\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.443224 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-scripts\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.444923 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-fernet-keys\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.448362 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-combined-ca-bundle\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.456352 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsnjl\" (UniqueName: \"kubernetes.io/projected/9c1a8234-533f-4cd5-9517-52acab86e99f-kube-api-access-lsnjl\") pod \"keystone-bootstrap-rsqds\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:27 crc kubenswrapper[5008]: I0318 18:23:27.527396 
5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rsqds"
Mar 18 18:23:28 crc kubenswrapper[5008]: I0318 18:23:28.209907 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8df7f5ad-81e9-4e27-b9d9-d6ad3959b610" path="/var/lib/kubelet/pods/8df7f5ad-81e9-4e27-b9d9-d6ad3959b610/volumes"
Mar 18 18:23:28 crc kubenswrapper[5008]: I0318 18:23:28.210980 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4cf72df-2cbd-40ad-a19d-3b3fb02410b8" path="/var/lib/kubelet/pods/d4cf72df-2cbd-40ad-a19d-3b3fb02410b8/volumes"
Mar 18 18:23:31 crc kubenswrapper[5008]: I0318 18:23:31.116031 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" podUID="9b085aa0-d1ca-47a4-9b12-588dc9be67fe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused"
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.112768 5008 generic.go:334] "Generic (PLEG): container finished" podID="05c9a5fb-e4e1-4d69-b790-2f5890b62aa8" containerID="583ea8f91c610b42793eda14286b85f2452817e74199e6c8c7a0515bdd26747e" exitCode=0
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.112857 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-887vw" event={"ID":"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8","Type":"ContainerDied","Data":"583ea8f91c610b42793eda14286b85f2452817e74199e6c8c7a0515bdd26747e"}
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.601703 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.785575 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-combined-ca-bundle\") pod \"8dc36d57-9844-4986-946c-f3c5fd678ecd\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") "
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.785956 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-config-data\") pod \"8dc36d57-9844-4986-946c-f3c5fd678ecd\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") "
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.786003 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-internal-tls-certs\") pod \"8dc36d57-9844-4986-946c-f3c5fd678ecd\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") "
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.786051 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dc36d57-9844-4986-946c-f3c5fd678ecd-httpd-run\") pod \"8dc36d57-9844-4986-946c-f3c5fd678ecd\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") "
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.786101 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"8dc36d57-9844-4986-946c-f3c5fd678ecd\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") "
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.786674 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dc36d57-9844-4986-946c-f3c5fd678ecd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8dc36d57-9844-4986-946c-f3c5fd678ecd" (UID: "8dc36d57-9844-4986-946c-f3c5fd678ecd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.786774 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dc36d57-9844-4986-946c-f3c5fd678ecd-logs\") pod \"8dc36d57-9844-4986-946c-f3c5fd678ecd\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") "
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.787079 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dc36d57-9844-4986-946c-f3c5fd678ecd-logs" (OuterVolumeSpecName: "logs") pod "8dc36d57-9844-4986-946c-f3c5fd678ecd" (UID: "8dc36d57-9844-4986-946c-f3c5fd678ecd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.787159 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-scripts\") pod \"8dc36d57-9844-4986-946c-f3c5fd678ecd\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") "
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.787659 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btrmn\" (UniqueName: \"kubernetes.io/projected/8dc36d57-9844-4986-946c-f3c5fd678ecd-kube-api-access-btrmn\") pod \"8dc36d57-9844-4986-946c-f3c5fd678ecd\" (UID: \"8dc36d57-9844-4986-946c-f3c5fd678ecd\") "
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.788446 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dc36d57-9844-4986-946c-f3c5fd678ecd-logs\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.788482 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dc36d57-9844-4986-946c-f3c5fd678ecd-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.790481 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "8dc36d57-9844-4986-946c-f3c5fd678ecd" (UID: "8dc36d57-9844-4986-946c-f3c5fd678ecd"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.791772 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-scripts" (OuterVolumeSpecName: "scripts") pod "8dc36d57-9844-4986-946c-f3c5fd678ecd" (UID: "8dc36d57-9844-4986-946c-f3c5fd678ecd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.791851 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc36d57-9844-4986-946c-f3c5fd678ecd-kube-api-access-btrmn" (OuterVolumeSpecName: "kube-api-access-btrmn") pod "8dc36d57-9844-4986-946c-f3c5fd678ecd" (UID: "8dc36d57-9844-4986-946c-f3c5fd678ecd"). InnerVolumeSpecName "kube-api-access-btrmn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.809878 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dc36d57-9844-4986-946c-f3c5fd678ecd" (UID: "8dc36d57-9844-4986-946c-f3c5fd678ecd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.834731 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-config-data" (OuterVolumeSpecName: "config-data") pod "8dc36d57-9844-4986-946c-f3c5fd678ecd" (UID: "8dc36d57-9844-4986-946c-f3c5fd678ecd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.848016 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8dc36d57-9844-4986-946c-f3c5fd678ecd" (UID: "8dc36d57-9844-4986-946c-f3c5fd678ecd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.890023 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.890082 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.890098 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.890111 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btrmn\" (UniqueName: \"kubernetes.io/projected/8dc36d57-9844-4986-946c-f3c5fd678ecd-kube-api-access-btrmn\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.890122 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.890130 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc36d57-9844-4986-946c-f3c5fd678ecd-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.906806 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Mar 18 18:23:34 crc kubenswrapper[5008]: I0318 18:23:34.992166 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.123854 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8dc36d57-9844-4986-946c-f3c5fd678ecd","Type":"ContainerDied","Data":"6036c7a45277a503b50fbcc89e6f0b223e6a3b2a9ebb923a5a69a2a693a3bfa5"}
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.123883 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.176412 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.189770 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 18:23:35 crc kubenswrapper[5008]: E0318 18:23:35.210667 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a"
Mar 18 18:23:35 crc kubenswrapper[5008]: E0318 18:23:35.210819 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9ql95,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-jgqxf_openstack(315d5f4a-5139-47d4-8aaf-c3088d6eae91): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 18 18:23:35 crc kubenswrapper[5008]: E0318 18:23:35.212919 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-jgqxf" podUID="315d5f4a-5139-47d4-8aaf-c3088d6eae91"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.229030 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 18:23:35 crc kubenswrapper[5008]: E0318 18:23:35.229389 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc36d57-9844-4986-946c-f3c5fd678ecd" containerName="glance-httpd"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.229405 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc36d57-9844-4986-946c-f3c5fd678ecd" containerName="glance-httpd"
Mar 18 18:23:35 crc kubenswrapper[5008]: E0318 18:23:35.229429 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc36d57-9844-4986-946c-f3c5fd678ecd" containerName="glance-log"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.229435 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc36d57-9844-4986-946c-f3c5fd678ecd" containerName="glance-log"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.229607 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc36d57-9844-4986-946c-f3c5fd678ecd" containerName="glance-httpd"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.229617 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc36d57-9844-4986-946c-f3c5fd678ecd" containerName="glance-log"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.236810 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.246968 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.247050 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.247890 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.403757 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.403804 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.403827 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.403858 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.403904 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.403926 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.403944 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwbh8\" (UniqueName: \"kubernetes.io/projected/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-kube-api-access-bwbh8\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.403960 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.505296 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.505727 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.505768 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwbh8\" (UniqueName: \"kubernetes.io/projected/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-kube-api-access-bwbh8\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.505792 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.505888 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.505914 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.505918 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.505974 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.506019 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.506101 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.506349 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.512852 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.512866 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.513302 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.521562 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.525490 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwbh8\" (UniqueName: \"kubernetes.io/projected/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-kube-api-access-bwbh8\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.533471 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:35 crc kubenswrapper[5008]: I0318 18:23:35.588456 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 18:23:36 crc kubenswrapper[5008]: E0318 18:23:36.136995 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a\\\"\"" pod="openstack/barbican-db-sync-jgqxf" podUID="315d5f4a-5139-47d4-8aaf-c3088d6eae91"
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.214048 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dc36d57-9844-4986-946c-f3c5fd678ecd" path="/var/lib/kubelet/pods/8dc36d57-9844-4986-946c-f3c5fd678ecd/volumes"
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.519675 5008 scope.go:117] "RemoveContainer" containerID="3c7b30d562f8fe920a7718a5d1e21d95fd6473a6216529c345d0522f2eaba9b2"
Mar 18 18:23:36 crc kubenswrapper[5008]: E0318 18:23:36.543451 5008 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b"
Mar 18 18:23:36 crc kubenswrapper[5008]: E0318 18:23:36.543858 5008 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zl2gj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-jbx2h_openstack(bed54cd2-a411-4362-a7b1-7fab16ba8b6b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 18 18:23:36 crc kubenswrapper[5008]: E0318 18:23:36.545178 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-jbx2h" podUID="bed54cd2-a411-4362-a7b1-7fab16ba8b6b"
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.759784 5008 scope.go:117] "RemoveContainer" containerID="eeacb404fbc0d0dec44c79173d40381e16d6d31b61813b097d343131cc9bb34f"
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.841355 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz"
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.865198 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-887vw"
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.871978 5008 scope.go:117] "RemoveContainer" containerID="4bda19fda517832cd7ef7bacb245df37d2fc34d0da8ce49c4ae7c20ef849bfc6"
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.937249 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dwpw\" (UniqueName: \"kubernetes.io/projected/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-kube-api-access-6dwpw\") pod \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") "
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.937342 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8bns\" (UniqueName: \"kubernetes.io/projected/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-kube-api-access-n8bns\") pod \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\" (UID: \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\") "
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.937412 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-config\") pod \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\" (UID: \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\") "
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.937434 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-ovsdbserver-nb\") pod \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") "
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.937489 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-combined-ca-bundle\") pod \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\" (UID: \"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8\") "
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.937549 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-dns-svc\") pod \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") "
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.937638 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-ovsdbserver-sb\") pod \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") "
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.937661 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-config\") pod \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\" (UID: \"9b085aa0-d1ca-47a4-9b12-588dc9be67fe\") "
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.945328 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-kube-api-access-6dwpw" (OuterVolumeSpecName: "kube-api-access-6dwpw") pod "9b085aa0-d1ca-47a4-9b12-588dc9be67fe" (UID: "9b085aa0-d1ca-47a4-9b12-588dc9be67fe"). InnerVolumeSpecName "kube-api-access-6dwpw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:23:36 crc kubenswrapper[5008]: I0318 18:23:36.956384 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-kube-api-access-n8bns" (OuterVolumeSpecName: "kube-api-access-n8bns") pod "05c9a5fb-e4e1-4d69-b790-2f5890b62aa8" (UID: "05c9a5fb-e4e1-4d69-b790-2f5890b62aa8"). InnerVolumeSpecName "kube-api-access-n8bns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.001349 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05c9a5fb-e4e1-4d69-b790-2f5890b62aa8" (UID: "05c9a5fb-e4e1-4d69-b790-2f5890b62aa8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.017623 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-config" (OuterVolumeSpecName: "config") pod "05c9a5fb-e4e1-4d69-b790-2f5890b62aa8" (UID: "05c9a5fb-e4e1-4d69-b790-2f5890b62aa8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.039877 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dwpw\" (UniqueName: \"kubernetes.io/projected/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-kube-api-access-6dwpw\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.039912 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8bns\" (UniqueName: \"kubernetes.io/projected/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-kube-api-access-n8bns\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.039923 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.039938 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.048075 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rsqds"]
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.051356 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-config" (OuterVolumeSpecName: "config") pod "9b085aa0-d1ca-47a4-9b12-588dc9be67fe" (UID: "9b085aa0-d1ca-47a4-9b12-588dc9be67fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.052448 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9b085aa0-d1ca-47a4-9b12-588dc9be67fe" (UID: "9b085aa0-d1ca-47a4-9b12-588dc9be67fe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.064290 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9b085aa0-d1ca-47a4-9b12-588dc9be67fe" (UID: "9b085aa0-d1ca-47a4-9b12-588dc9be67fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.069575 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9b085aa0-d1ca-47a4-9b12-588dc9be67fe" (UID: "9b085aa0-d1ca-47a4-9b12-588dc9be67fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.141267 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.141597 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.141610 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.141621 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b085aa0-d1ca-47a4-9b12-588dc9be67fe-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.154054 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a81efd6f-d370-4c91-9343-75f6e6d1e85d","Type":"ContainerStarted","Data":"a00d2e33d5a611eb348a15725223f5f013cb2fd357ace420a3b26bc02145ca80"}
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.162260 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" event={"ID":"9b085aa0-d1ca-47a4-9b12-588dc9be67fe","Type":"ContainerDied","Data":"2ed2580cbeeff737c4d9a1292396a4ee8b39ba25e773cc3c4a51adce9e93a7cd"}
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.162302 5008 scope.go:117] "RemoveContainer" containerID="16f326ba4481d5794a3cf862529549c6f7b6bc1c7a74670132db25e791e7ddc5"
Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.162387 5008 util.go:48] "No ready sandbox for pod can be
found. Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.176768 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rsqds" event={"ID":"9c1a8234-533f-4cd5-9517-52acab86e99f","Type":"ContainerStarted","Data":"e90ae921b04a6d778ad146df03b17c4e9258fa5fd28614f5f99805e62e7e5721"} Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.177097 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.181924 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8x6nb" event={"ID":"906dac4b-4209-4b3b-b934-6804508c028b","Type":"ContainerStarted","Data":"691a4cffba40335c2f67b7e5844c2b3a8e32108a6a47bfa2c66e10562fd31321"} Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.192825 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-887vw" Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.192883 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-887vw" event={"ID":"05c9a5fb-e4e1-4d69-b790-2f5890b62aa8","Type":"ContainerDied","Data":"52b349ebc11089b61d282a1a50273e0220204eae5672408667d0702801193b82"} Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.192926 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52b349ebc11089b61d282a1a50273e0220204eae5672408667d0702801193b82" Mar 18 18:23:37 crc kubenswrapper[5008]: E0318 18:23:37.202910 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b\\\"\"" pod="openstack/cinder-db-sync-jbx2h" 
podUID="bed54cd2-a411-4362-a7b1-7fab16ba8b6b" Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.203008 5008 scope.go:117] "RemoveContainer" containerID="34521d9f15a48c44fcbedade648e3cbe3d7738b22fb676c0b85404a2785ed4ff" Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.208917 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8x6nb" podStartSLOduration=4.266107366 podStartE2EDuration="24.208902623s" podCreationTimestamp="2026-03-18 18:23:13 +0000 UTC" firstStartedPulling="2026-03-18 18:23:14.528885314 +0000 UTC m=+1251.048358393" lastFinishedPulling="2026-03-18 18:23:34.471680521 +0000 UTC m=+1270.991153650" observedRunningTime="2026-03-18 18:23:37.20317254 +0000 UTC m=+1273.722645619" watchObservedRunningTime="2026-03-18 18:23:37.208902623 +0000 UTC m=+1273.728375702" Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.257288 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-8xdxz"] Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.268177 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-8xdxz"] Mar 18 18:23:37 crc kubenswrapper[5008]: I0318 18:23:37.276458 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.079613 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d8b7b7d5-s6tln"] Mar 18 18:23:38 crc kubenswrapper[5008]: E0318 18:23:38.080293 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b085aa0-d1ca-47a4-9b12-588dc9be67fe" containerName="dnsmasq-dns" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.080309 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b085aa0-d1ca-47a4-9b12-588dc9be67fe" containerName="dnsmasq-dns" Mar 18 18:23:38 crc kubenswrapper[5008]: E0318 18:23:38.080330 5008 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="05c9a5fb-e4e1-4d69-b790-2f5890b62aa8" containerName="neutron-db-sync" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.080337 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c9a5fb-e4e1-4d69-b790-2f5890b62aa8" containerName="neutron-db-sync" Mar 18 18:23:38 crc kubenswrapper[5008]: E0318 18:23:38.080356 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b085aa0-d1ca-47a4-9b12-588dc9be67fe" containerName="init" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.080363 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b085aa0-d1ca-47a4-9b12-588dc9be67fe" containerName="init" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.080512 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b085aa0-d1ca-47a4-9b12-588dc9be67fe" containerName="dnsmasq-dns" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.080523 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="05c9a5fb-e4e1-4d69-b790-2f5890b62aa8" containerName="neutron-db-sync" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.081422 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.092340 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d8b7b7d5-s6tln"] Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.105581 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5bfd8598c6-wqfkp"] Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.134231 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bfd8598c6-wqfkp"] Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.134348 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.138772 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cdq7v" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.138938 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.139113 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.139460 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.171588 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-config\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.171634 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-ovsdbserver-sb\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.171666 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-ovsdbserver-nb\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.171720 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls586\" (UniqueName: \"kubernetes.io/projected/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-kube-api-access-ls586\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.171748 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-dns-swift-storage-0\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.171793 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-dns-svc\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.217814 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b085aa0-d1ca-47a4-9b12-588dc9be67fe" path="/var/lib/kubelet/pods/9b085aa0-d1ca-47a4-9b12-588dc9be67fe/volumes" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.255525 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d2a0f99-a606-4784-b6ca-9f6561d3cf93","Type":"ContainerStarted","Data":"5b503eba8269a9c14bab3bc0a7cd48f4da284b901f3c8e8402accd085dd47492"} Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.255587 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9d2a0f99-a606-4784-b6ca-9f6561d3cf93","Type":"ContainerStarted","Data":"9d5b7dc39e865c570c9cf076f2bab1395af4e14f4149d549f0743576100f6a26"} Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.258959 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rsqds" event={"ID":"9c1a8234-533f-4cd5-9517-52acab86e99f","Type":"ContainerStarted","Data":"fd8acbbbdc07719b3fd9789e70de5b0c19e9336a0bf5e6a2e0e491ef32ac7c89"} Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.260680 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31a8e3a3-716a-49ec-962e-25580e47f4a6","Type":"ContainerStarted","Data":"af73e056114a2a33a709e5c17ac2da30e70d25f0b9ec21b66200b676e70c1886"} Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.260711 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31a8e3a3-716a-49ec-962e-25580e47f4a6","Type":"ContainerStarted","Data":"133bf0b5d01f0d7849f694f5fc6b409eeaa4a83d95037a88f2a221a5db609012"} Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.273027 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cjsf\" (UniqueName: \"kubernetes.io/projected/89496eaf-50d1-45a6-802f-f127c3766a3b-kube-api-access-5cjsf\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.273072 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-dns-swift-storage-0\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.276189 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-dns-swift-storage-0\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.276481 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-httpd-config\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.276527 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-dns-svc\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.276564 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-ovndb-tls-certs\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.276620 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-config\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.277892 5008 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-config\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.277950 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-ovsdbserver-sb\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.278015 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-ovsdbserver-nb\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.278072 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-dns-svc\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.278081 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-combined-ca-bundle\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.278235 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls586\" (UniqueName: 
\"kubernetes.io/projected/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-kube-api-access-ls586\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.278544 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-config\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.279169 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-ovsdbserver-nb\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.279908 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-ovsdbserver-sb\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.301198 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls586\" (UniqueName: \"kubernetes.io/projected/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-kube-api-access-ls586\") pod \"dnsmasq-dns-5d8b7b7d5-s6tln\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") " pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.325214 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rsqds" podStartSLOduration=11.325190335 podStartE2EDuration="11.325190335s" 
podCreationTimestamp="2026-03-18 18:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:38.280379476 +0000 UTC m=+1274.799852545" watchObservedRunningTime="2026-03-18 18:23:38.325190335 +0000 UTC m=+1274.844663424" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.379476 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-config\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.379617 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-combined-ca-bundle\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.379681 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cjsf\" (UniqueName: \"kubernetes.io/projected/89496eaf-50d1-45a6-802f-f127c3766a3b-kube-api-access-5cjsf\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.379763 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-httpd-config\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.379784 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-ovndb-tls-certs\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.386388 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-combined-ca-bundle\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.386795 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-httpd-config\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.387518 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-ovndb-tls-certs\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.387810 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-config\") pod \"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.398364 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cjsf\" (UniqueName: \"kubernetes.io/projected/89496eaf-50d1-45a6-802f-f127c3766a3b-kube-api-access-5cjsf\") pod 
\"neutron-5bfd8598c6-wqfkp\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.455314 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:38 crc kubenswrapper[5008]: I0318 18:23:38.500715 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:39 crc kubenswrapper[5008]: I0318 18:23:39.298027 5008 generic.go:334] "Generic (PLEG): container finished" podID="906dac4b-4209-4b3b-b934-6804508c028b" containerID="691a4cffba40335c2f67b7e5844c2b3a8e32108a6a47bfa2c66e10562fd31321" exitCode=0 Mar 18 18:23:39 crc kubenswrapper[5008]: I0318 18:23:39.298325 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8x6nb" event={"ID":"906dac4b-4209-4b3b-b934-6804508c028b","Type":"ContainerDied","Data":"691a4cffba40335c2f67b7e5844c2b3a8e32108a6a47bfa2c66e10562fd31321"} Mar 18 18:23:39 crc kubenswrapper[5008]: I0318 18:23:39.306721 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31a8e3a3-716a-49ec-962e-25580e47f4a6","Type":"ContainerStarted","Data":"02df5e9b1d3df7f9943325c9d6500ce870862dd110d6174a71739dbe225d9cb4"} Mar 18 18:23:39 crc kubenswrapper[5008]: I0318 18:23:39.319521 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d2a0f99-a606-4784-b6ca-9f6561d3cf93","Type":"ContainerStarted","Data":"4da6a325e3a862c85900f07ee6336bf83662872bab02f31103bf97e761e286ec"} Mar 18 18:23:39 crc kubenswrapper[5008]: I0318 18:23:39.337173 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=13.33715124 podStartE2EDuration="13.33715124s" podCreationTimestamp="2026-03-18 18:23:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:39.332989556 +0000 UTC m=+1275.852462635" watchObservedRunningTime="2026-03-18 18:23:39.33715124 +0000 UTC m=+1275.856624319" Mar 18 18:23:39 crc kubenswrapper[5008]: I0318 18:23:39.364286 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.364267407 podStartE2EDuration="4.364267407s" podCreationTimestamp="2026-03-18 18:23:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:39.356530584 +0000 UTC m=+1275.876003663" watchObservedRunningTime="2026-03-18 18:23:39.364267407 +0000 UTC m=+1275.883740486" Mar 18 18:23:39 crc kubenswrapper[5008]: I0318 18:23:39.679792 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d8b7b7d5-s6tln"] Mar 18 18:23:39 crc kubenswrapper[5008]: I0318 18:23:39.894723 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bfd8598c6-wqfkp"] Mar 18 18:23:39 crc kubenswrapper[5008]: W0318 18:23:39.900458 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89496eaf_50d1_45a6_802f_f127c3766a3b.slice/crio-8161022d056d5e6c08cc080eaa78d9c23395cbc52b9355004d16352597140a47 WatchSource:0}: Error finding container 8161022d056d5e6c08cc080eaa78d9c23395cbc52b9355004d16352597140a47: Status 404 returned error can't find the container with id 8161022d056d5e6c08cc080eaa78d9c23395cbc52b9355004d16352597140a47 Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.327175 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a81efd6f-d370-4c91-9343-75f6e6d1e85d","Type":"ContainerStarted","Data":"7808d4a8a291f22c3970c6e63bf684a08880d110b4cc300ab3fdba275f68472a"} Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.329744 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bfd8598c6-wqfkp" event={"ID":"89496eaf-50d1-45a6-802f-f127c3766a3b","Type":"ContainerStarted","Data":"db9b0e64ba2d9d784f3f83ded3ad1d7fc71e671670340db39bbbfe4e5d0921c0"} Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.329777 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bfd8598c6-wqfkp" event={"ID":"89496eaf-50d1-45a6-802f-f127c3766a3b","Type":"ContainerStarted","Data":"610e5ea90ac2ed7181ffc28f663e96834799eed0a2d6a49084894caa9dd208f6"} Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.329791 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bfd8598c6-wqfkp" event={"ID":"89496eaf-50d1-45a6-802f-f127c3766a3b","Type":"ContainerStarted","Data":"8161022d056d5e6c08cc080eaa78d9c23395cbc52b9355004d16352597140a47"} Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.330850 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.332094 5008 generic.go:334] "Generic (PLEG): container finished" podID="46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" containerID="fc81abb687b305290157c81fcd0e5318de51c2a8a697f416846877ce529119b1" exitCode=0 Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.332706 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" event={"ID":"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3","Type":"ContainerDied","Data":"fc81abb687b305290157c81fcd0e5318de51c2a8a697f416846877ce529119b1"} Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.332769 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" 
event={"ID":"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3","Type":"ContainerStarted","Data":"26778168550e71b340cf3d82a02a64909b4b55ca7faa02c7bca2238e35a4d77e"} Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.362033 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5bfd8598c6-wqfkp" podStartSLOduration=2.362008457 podStartE2EDuration="2.362008457s" podCreationTimestamp="2026-03-18 18:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:40.344195852 +0000 UTC m=+1276.863668931" watchObservedRunningTime="2026-03-18 18:23:40.362008457 +0000 UTC m=+1276.881481536" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.729627 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.774474 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c7c77574f-9xjzj"] Mar 18 18:23:40 crc kubenswrapper[5008]: E0318 18:23:40.774893 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906dac4b-4209-4b3b-b934-6804508c028b" containerName="placement-db-sync" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.774911 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="906dac4b-4209-4b3b-b934-6804508c028b" containerName="placement-db-sync" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.775076 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="906dac4b-4209-4b3b-b934-6804508c028b" containerName="placement-db-sync" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.775937 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.778669 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.778827 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.799913 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c7c77574f-9xjzj"] Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.828845 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/906dac4b-4209-4b3b-b934-6804508c028b-logs\") pod \"906dac4b-4209-4b3b-b934-6804508c028b\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.828922 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-combined-ca-bundle\") pod \"906dac4b-4209-4b3b-b934-6804508c028b\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.829086 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-config-data\") pod \"906dac4b-4209-4b3b-b934-6804508c028b\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.829135 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7ntd\" (UniqueName: \"kubernetes.io/projected/906dac4b-4209-4b3b-b934-6804508c028b-kube-api-access-d7ntd\") pod \"906dac4b-4209-4b3b-b934-6804508c028b\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " Mar 18 18:23:40 crc 
kubenswrapper[5008]: I0318 18:23:40.829194 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-scripts\") pod \"906dac4b-4209-4b3b-b934-6804508c028b\" (UID: \"906dac4b-4209-4b3b-b934-6804508c028b\") " Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.829268 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906dac4b-4209-4b3b-b934-6804508c028b-logs" (OuterVolumeSpecName: "logs") pod "906dac4b-4209-4b3b-b934-6804508c028b" (UID: "906dac4b-4209-4b3b-b934-6804508c028b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.829498 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-ovndb-tls-certs\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.829540 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-config\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.829625 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-httpd-config\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.829662 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-internal-tls-certs\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.829719 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-public-tls-certs\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.829756 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-combined-ca-bundle\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.829802 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kkml\" (UniqueName: \"kubernetes.io/projected/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-kube-api-access-4kkml\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.829857 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/906dac4b-4209-4b3b-b934-6804508c028b-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.854185 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906dac4b-4209-4b3b-b934-6804508c028b-kube-api-access-d7ntd" 
(OuterVolumeSpecName: "kube-api-access-d7ntd") pod "906dac4b-4209-4b3b-b934-6804508c028b" (UID: "906dac4b-4209-4b3b-b934-6804508c028b"). InnerVolumeSpecName "kube-api-access-d7ntd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.855091 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-scripts" (OuterVolumeSpecName: "scripts") pod "906dac4b-4209-4b3b-b934-6804508c028b" (UID: "906dac4b-4209-4b3b-b934-6804508c028b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.857882 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-config-data" (OuterVolumeSpecName: "config-data") pod "906dac4b-4209-4b3b-b934-6804508c028b" (UID: "906dac4b-4209-4b3b-b934-6804508c028b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.859903 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "906dac4b-4209-4b3b-b934-6804508c028b" (UID: "906dac4b-4209-4b3b-b934-6804508c028b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.931356 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kkml\" (UniqueName: \"kubernetes.io/projected/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-kube-api-access-4kkml\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.931415 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-ovndb-tls-certs\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.931438 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-config\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.931485 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-httpd-config\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.931515 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-internal-tls-certs\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 
18:23:40.931570 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-public-tls-certs\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.931599 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-combined-ca-bundle\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.931640 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.931652 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.931662 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906dac4b-4209-4b3b-b934-6804508c028b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.931672 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7ntd\" (UniqueName: \"kubernetes.io/projected/906dac4b-4209-4b3b-b934-6804508c028b-kube-api-access-d7ntd\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.935255 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-httpd-config\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.936578 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-ovndb-tls-certs\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.937203 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-combined-ca-bundle\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.937490 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-config\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.938985 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-public-tls-certs\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.940230 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-internal-tls-certs\") pod \"neutron-c7c77574f-9xjzj\" (UID: 
\"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:40 crc kubenswrapper[5008]: I0318 18:23:40.950583 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kkml\" (UniqueName: \"kubernetes.io/projected/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-kube-api-access-4kkml\") pod \"neutron-c7c77574f-9xjzj\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.092127 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.116176 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b4ddd5fb7-8xdxz" podUID="9b085aa0-d1ca-47a4-9b12-588dc9be67fe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.359509 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" event={"ID":"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3","Type":"ContainerStarted","Data":"7621bb9c665d645203bf541bba3368fa0f57450cadeaaf492451dd12c5b1a75d"} Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.360722 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.373907 5008 generic.go:334] "Generic (PLEG): container finished" podID="9c1a8234-533f-4cd5-9517-52acab86e99f" containerID="fd8acbbbdc07719b3fd9789e70de5b0c19e9336a0bf5e6a2e0e491ef32ac7c89" exitCode=0 Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.374033 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rsqds" 
event={"ID":"9c1a8234-533f-4cd5-9517-52acab86e99f","Type":"ContainerDied","Data":"fd8acbbbdc07719b3fd9789e70de5b0c19e9336a0bf5e6a2e0e491ef32ac7c89"} Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.378284 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8x6nb" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.378794 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8x6nb" event={"ID":"906dac4b-4209-4b3b-b934-6804508c028b","Type":"ContainerDied","Data":"47177af81029ecb448cd55ec70712c6959f33aa763b54c1c187d88d8faac0aea"} Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.378863 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47177af81029ecb448cd55ec70712c6959f33aa763b54c1c187d88d8faac0aea" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.399847 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" podStartSLOduration=3.399826369 podStartE2EDuration="3.399826369s" podCreationTimestamp="2026-03-18 18:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:41.382999909 +0000 UTC m=+1277.902472998" watchObservedRunningTime="2026-03-18 18:23:41.399826369 +0000 UTC m=+1277.919299448" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.473642 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6b8cb95cb8-42zf7"] Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.475165 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.477039 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.477379 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dfntc" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.478839 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.483225 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.484475 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.509378 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b8cb95cb8-42zf7"] Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.558515 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-scripts\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.558580 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-config-data\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.558606 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-internal-tls-certs\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.558624 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-combined-ca-bundle\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.558659 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t59j9\" (UniqueName: \"kubernetes.io/projected/e558c4fe-4c62-49b6-bcbe-838e404d216c-kube-api-access-t59j9\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.558698 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e558c4fe-4c62-49b6-bcbe-838e404d216c-logs\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.558721 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-public-tls-certs\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.660465 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-internal-tls-certs\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.660508 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-combined-ca-bundle\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.660561 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t59j9\" (UniqueName: \"kubernetes.io/projected/e558c4fe-4c62-49b6-bcbe-838e404d216c-kube-api-access-t59j9\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.660605 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e558c4fe-4c62-49b6-bcbe-838e404d216c-logs\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.660631 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-public-tls-certs\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.660702 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-scripts\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.660718 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-config-data\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.669085 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e558c4fe-4c62-49b6-bcbe-838e404d216c-logs\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.673123 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-internal-tls-certs\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.673747 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-config-data\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.686963 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-scripts\") pod \"placement-6b8cb95cb8-42zf7\" (UID: 
\"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.696776 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t59j9\" (UniqueName: \"kubernetes.io/projected/e558c4fe-4c62-49b6-bcbe-838e404d216c-kube-api-access-t59j9\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.704216 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-public-tls-certs\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.704469 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-combined-ca-bundle\") pod \"placement-6b8cb95cb8-42zf7\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.745570 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c7c77574f-9xjzj"] Mar 18 18:23:41 crc kubenswrapper[5008]: I0318 18:23:41.807021 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:42 crc kubenswrapper[5008]: I0318 18:23:42.304354 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b8cb95cb8-42zf7"] Mar 18 18:23:42 crc kubenswrapper[5008]: W0318 18:23:42.314254 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode558c4fe_4c62_49b6_bcbe_838e404d216c.slice/crio-df6626fa38a969d5df6aa43626b0f88a0c97d093ff1ca0cec9f59025aaa6bcbe WatchSource:0}: Error finding container df6626fa38a969d5df6aa43626b0f88a0c97d093ff1ca0cec9f59025aaa6bcbe: Status 404 returned error can't find the container with id df6626fa38a969d5df6aa43626b0f88a0c97d093ff1ca0cec9f59025aaa6bcbe Mar 18 18:23:42 crc kubenswrapper[5008]: I0318 18:23:42.399795 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7c77574f-9xjzj" event={"ID":"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d","Type":"ContainerStarted","Data":"9c613ccb6c89ec32c5078a70b44fae2c00bb3b09aa99e018dbbd081bec4cccf3"} Mar 18 18:23:42 crc kubenswrapper[5008]: I0318 18:23:42.399854 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7c77574f-9xjzj" event={"ID":"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d","Type":"ContainerStarted","Data":"d8321e3a7df103240068788e10312c916b5bd1e5358b1b6141e98b2af6e2420b"} Mar 18 18:23:42 crc kubenswrapper[5008]: I0318 18:23:42.399867 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7c77574f-9xjzj" event={"ID":"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d","Type":"ContainerStarted","Data":"b1a0743d568bf4af006d3eba3792f5fbd51a4e0d597f352cc0fb534f9ad43132"} Mar 18 18:23:42 crc kubenswrapper[5008]: I0318 18:23:42.399924 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:23:42 crc kubenswrapper[5008]: I0318 18:23:42.404191 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-6b8cb95cb8-42zf7" event={"ID":"e558c4fe-4c62-49b6-bcbe-838e404d216c","Type":"ContainerStarted","Data":"df6626fa38a969d5df6aa43626b0f88a0c97d093ff1ca0cec9f59025aaa6bcbe"} Mar 18 18:23:42 crc kubenswrapper[5008]: I0318 18:23:42.429680 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c7c77574f-9xjzj" podStartSLOduration=2.42965651 podStartE2EDuration="2.42965651s" podCreationTimestamp="2026-03-18 18:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:42.419013334 +0000 UTC m=+1278.938486413" watchObservedRunningTime="2026-03-18 18:23:42.42965651 +0000 UTC m=+1278.949129589" Mar 18 18:23:42 crc kubenswrapper[5008]: I0318 18:23:42.868330 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.034090 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-credential-keys\") pod \"9c1a8234-533f-4cd5-9517-52acab86e99f\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.034154 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsnjl\" (UniqueName: \"kubernetes.io/projected/9c1a8234-533f-4cd5-9517-52acab86e99f-kube-api-access-lsnjl\") pod \"9c1a8234-533f-4cd5-9517-52acab86e99f\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.034180 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-config-data\") pod \"9c1a8234-533f-4cd5-9517-52acab86e99f\" (UID: 
\"9c1a8234-533f-4cd5-9517-52acab86e99f\") " Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.034239 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-fernet-keys\") pod \"9c1a8234-533f-4cd5-9517-52acab86e99f\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.034313 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-scripts\") pod \"9c1a8234-533f-4cd5-9517-52acab86e99f\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.034330 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-combined-ca-bundle\") pod \"9c1a8234-533f-4cd5-9517-52acab86e99f\" (UID: \"9c1a8234-533f-4cd5-9517-52acab86e99f\") " Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.040152 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9c1a8234-533f-4cd5-9517-52acab86e99f" (UID: "9c1a8234-533f-4cd5-9517-52acab86e99f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.040629 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1a8234-533f-4cd5-9517-52acab86e99f-kube-api-access-lsnjl" (OuterVolumeSpecName: "kube-api-access-lsnjl") pod "9c1a8234-533f-4cd5-9517-52acab86e99f" (UID: "9c1a8234-533f-4cd5-9517-52acab86e99f"). InnerVolumeSpecName "kube-api-access-lsnjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.040690 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-scripts" (OuterVolumeSpecName: "scripts") pod "9c1a8234-533f-4cd5-9517-52acab86e99f" (UID: "9c1a8234-533f-4cd5-9517-52acab86e99f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.041221 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9c1a8234-533f-4cd5-9517-52acab86e99f" (UID: "9c1a8234-533f-4cd5-9517-52acab86e99f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.071659 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-config-data" (OuterVolumeSpecName: "config-data") pod "9c1a8234-533f-4cd5-9517-52acab86e99f" (UID: "9c1a8234-533f-4cd5-9517-52acab86e99f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.071776 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c1a8234-533f-4cd5-9517-52acab86e99f" (UID: "9c1a8234-533f-4cd5-9517-52acab86e99f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.136390 5008 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.136637 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.136726 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.136808 5008 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.136885 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsnjl\" (UniqueName: \"kubernetes.io/projected/9c1a8234-533f-4cd5-9517-52acab86e99f-kube-api-access-lsnjl\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.136967 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1a8234-533f-4cd5-9517-52acab86e99f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.423546 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b8cb95cb8-42zf7" event={"ID":"e558c4fe-4c62-49b6-bcbe-838e404d216c","Type":"ContainerStarted","Data":"049f70af73478e42ee612a1c3c40887112dadd34cbb4d041635a2619f37a1ce7"} Mar 18 18:23:43 crc kubenswrapper[5008]: 
I0318 18:23:43.423607 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b8cb95cb8-42zf7" event={"ID":"e558c4fe-4c62-49b6-bcbe-838e404d216c","Type":"ContainerStarted","Data":"937fbe19a64b70075afb8572736ff34948d329dab0dd7f4095eb32fd67a37baf"} Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.423652 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.423698 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.440467 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rsqds" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.445701 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rsqds" event={"ID":"9c1a8234-533f-4cd5-9517-52acab86e99f","Type":"ContainerDied","Data":"e90ae921b04a6d778ad146df03b17c4e9258fa5fd28614f5f99805e62e7e5721"} Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.445756 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e90ae921b04a6d778ad146df03b17c4e9258fa5fd28614f5f99805e62e7e5721" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.453360 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6b8cb95cb8-42zf7" podStartSLOduration=2.453344508 podStartE2EDuration="2.453344508s" podCreationTimestamp="2026-03-18 18:23:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:23:43.445979814 +0000 UTC m=+1279.965452893" watchObservedRunningTime="2026-03-18 18:23:43.453344508 +0000 UTC m=+1279.972817587" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.553758 5008 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/keystone-7d8b8459c4-2mq5n"] Mar 18 18:23:43 crc kubenswrapper[5008]: E0318 18:23:43.554223 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1a8234-533f-4cd5-9517-52acab86e99f" containerName="keystone-bootstrap" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.554247 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1a8234-533f-4cd5-9517-52acab86e99f" containerName="keystone-bootstrap" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.554468 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1a8234-533f-4cd5-9517-52acab86e99f" containerName="keystone-bootstrap" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.555151 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.558451 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-87mjc" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.558688 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.558756 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.558794 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.559021 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.559171 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.577033 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-7d8b8459c4-2mq5n"] Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.657595 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-config-data\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.657661 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-combined-ca-bundle\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.657686 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-fernet-keys\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.657756 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-internal-tls-certs\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.657918 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-scripts\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " 
pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.657995 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brd8q\" (UniqueName: \"kubernetes.io/projected/16314cf6-663f-4fa9-a1e7-272c1a183b58-kube-api-access-brd8q\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.658051 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-public-tls-certs\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.658070 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-credential-keys\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.759966 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-config-data\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.760045 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-combined-ca-bundle\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " 
pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.760088 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-fernet-keys\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.760167 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-internal-tls-certs\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.760207 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-scripts\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.760254 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brd8q\" (UniqueName: \"kubernetes.io/projected/16314cf6-663f-4fa9-a1e7-272c1a183b58-kube-api-access-brd8q\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.760284 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-public-tls-certs\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 
18:23:43.760303 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-credential-keys\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.768203 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-credential-keys\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.769694 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-fernet-keys\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.770085 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-combined-ca-bundle\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.771366 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-scripts\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.772058 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-internal-tls-certs\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.773022 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-public-tls-certs\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.773394 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-config-data\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.786313 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brd8q\" (UniqueName: \"kubernetes.io/projected/16314cf6-663f-4fa9-a1e7-272c1a183b58-kube-api-access-brd8q\") pod \"keystone-7d8b8459c4-2mq5n\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") " pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:43 crc kubenswrapper[5008]: I0318 18:23:43.899199 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:44 crc kubenswrapper[5008]: I0318 18:23:44.354421 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d8b8459c4-2mq5n"] Mar 18 18:23:44 crc kubenswrapper[5008]: I0318 18:23:44.452437 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d8b8459c4-2mq5n" event={"ID":"16314cf6-663f-4fa9-a1e7-272c1a183b58","Type":"ContainerStarted","Data":"8186640c7e91af798f46b247e00dbc88fe015ca6aad2b1642f9b8cdf8f15de9f"} Mar 18 18:23:45 crc kubenswrapper[5008]: I0318 18:23:45.589763 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 18:23:45 crc kubenswrapper[5008]: I0318 18:23:45.590351 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 18:23:45 crc kubenswrapper[5008]: I0318 18:23:45.630643 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 18:23:45 crc kubenswrapper[5008]: I0318 18:23:45.641600 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 18:23:46 crc kubenswrapper[5008]: I0318 18:23:46.486696 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 18:23:46 crc kubenswrapper[5008]: I0318 18:23:46.486751 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 18:23:46 crc kubenswrapper[5008]: I0318 18:23:46.704413 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 18:23:46 crc kubenswrapper[5008]: I0318 18:23:46.704984 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Mar 18 18:23:46 crc kubenswrapper[5008]: I0318 18:23:46.742424 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 18:23:46 crc kubenswrapper[5008]: I0318 18:23:46.758487 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 18:23:47 crc kubenswrapper[5008]: I0318 18:23:47.497300 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d8b8459c4-2mq5n" event={"ID":"16314cf6-663f-4fa9-a1e7-272c1a183b58","Type":"ContainerStarted","Data":"5398febbc7a97bbef77797f71cbc94ea09c16ee97775bfa2cf52a7892be384b7"} Mar 18 18:23:47 crc kubenswrapper[5008]: I0318 18:23:47.497601 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7d8b8459c4-2mq5n" Mar 18 18:23:47 crc kubenswrapper[5008]: I0318 18:23:47.501172 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a81efd6f-d370-4c91-9343-75f6e6d1e85d","Type":"ContainerStarted","Data":"d6455c2ff064befd66069965986d13aba7a8fe9b2ee23a975978a103770cd133"} Mar 18 18:23:47 crc kubenswrapper[5008]: I0318 18:23:47.501581 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 18:23:47 crc kubenswrapper[5008]: I0318 18:23:47.501617 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 18:23:47 crc kubenswrapper[5008]: I0318 18:23:47.536077 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7d8b8459c4-2mq5n" podStartSLOduration=4.53605565 podStartE2EDuration="4.53605565s" podCreationTimestamp="2026-03-18 18:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
18:23:47.522199423 +0000 UTC m=+1284.041672502" watchObservedRunningTime="2026-03-18 18:23:47.53605565 +0000 UTC m=+1284.055528739" Mar 18 18:23:48 crc kubenswrapper[5008]: I0318 18:23:48.457695 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:23:48 crc kubenswrapper[5008]: I0318 18:23:48.514205 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jgqxf" event={"ID":"315d5f4a-5139-47d4-8aaf-c3088d6eae91","Type":"ContainerStarted","Data":"0467eb604fa1a269513b52ef89a127c20e5e7cac19e0e4d6ff17d11d5afda14e"} Mar 18 18:23:48 crc kubenswrapper[5008]: I0318 18:23:48.554834 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7fb48775-xdlt2"] Mar 18 18:23:48 crc kubenswrapper[5008]: I0318 18:23:48.555070 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" podUID="f66df63f-e2b6-4f43-b3c0-9ebb157b0693" containerName="dnsmasq-dns" containerID="cri-o://43cfa593601e48407235506732eba9f5732b9c13fe7595b50879ee082d29c276" gracePeriod=10 Mar 18 18:23:48 crc kubenswrapper[5008]: I0318 18:23:48.666088 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 18:23:48 crc kubenswrapper[5008]: I0318 18:23:48.666178 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 18:23:48 crc kubenswrapper[5008]: I0318 18:23:48.715851 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-jgqxf" podStartSLOduration=2.610084789 podStartE2EDuration="35.715834268s" podCreationTimestamp="2026-03-18 18:23:13 +0000 UTC" firstStartedPulling="2026-03-18 18:23:14.638737599 +0000 UTC m=+1251.158210688" lastFinishedPulling="2026-03-18 18:23:47.744487078 +0000 UTC m=+1284.263960167" observedRunningTime="2026-03-18 18:23:48.560088457 +0000 UTC 
m=+1285.079561536" watchObservedRunningTime="2026-03-18 18:23:48.715834268 +0000 UTC m=+1285.235307347" Mar 18 18:23:48 crc kubenswrapper[5008]: I0318 18:23:48.982792 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.111269 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.156022 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjzlg\" (UniqueName: \"kubernetes.io/projected/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-kube-api-access-jjzlg\") pod \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.156068 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-config\") pod \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.156111 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-dns-svc\") pod \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.156171 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-ovsdbserver-sb\") pod \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.156242 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-ovsdbserver-nb\") pod \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.156278 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-dns-swift-storage-0\") pod \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\" (UID: \"f66df63f-e2b6-4f43-b3c0-9ebb157b0693\") " Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.162629 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-kube-api-access-jjzlg" (OuterVolumeSpecName: "kube-api-access-jjzlg") pod "f66df63f-e2b6-4f43-b3c0-9ebb157b0693" (UID: "f66df63f-e2b6-4f43-b3c0-9ebb157b0693"). InnerVolumeSpecName "kube-api-access-jjzlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.245980 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f66df63f-e2b6-4f43-b3c0-9ebb157b0693" (UID: "f66df63f-e2b6-4f43-b3c0-9ebb157b0693"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.247788 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f66df63f-e2b6-4f43-b3c0-9ebb157b0693" (UID: "f66df63f-e2b6-4f43-b3c0-9ebb157b0693"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.255017 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-config" (OuterVolumeSpecName: "config") pod "f66df63f-e2b6-4f43-b3c0-9ebb157b0693" (UID: "f66df63f-e2b6-4f43-b3c0-9ebb157b0693"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.258635 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.258662 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.258672 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjzlg\" (UniqueName: \"kubernetes.io/projected/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-kube-api-access-jjzlg\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.258681 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.261476 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f66df63f-e2b6-4f43-b3c0-9ebb157b0693" (UID: "f66df63f-e2b6-4f43-b3c0-9ebb157b0693"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.262067 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f66df63f-e2b6-4f43-b3c0-9ebb157b0693" (UID: "f66df63f-e2b6-4f43-b3c0-9ebb157b0693"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.363039 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.363084 5008 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f66df63f-e2b6-4f43-b3c0-9ebb157b0693-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.527698 5008 generic.go:334] "Generic (PLEG): container finished" podID="f66df63f-e2b6-4f43-b3c0-9ebb157b0693" containerID="43cfa593601e48407235506732eba9f5732b9c13fe7595b50879ee082d29c276" exitCode=0 Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.527773 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.527798 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" event={"ID":"f66df63f-e2b6-4f43-b3c0-9ebb157b0693","Type":"ContainerDied","Data":"43cfa593601e48407235506732eba9f5732b9c13fe7595b50879ee082d29c276"} Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.528156 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" event={"ID":"f66df63f-e2b6-4f43-b3c0-9ebb157b0693","Type":"ContainerDied","Data":"440e155fcc78b52bc476f90e706a59717c932602bb9edc57eaf75db097115d05"} Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.528205 5008 scope.go:117] "RemoveContainer" containerID="43cfa593601e48407235506732eba9f5732b9c13fe7595b50879ee082d29c276" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.582220 5008 scope.go:117] "RemoveContainer" containerID="80d2a947f294713c07eea3219b9344b3577bc4f8697c02a1a57e93289ca68073" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.595888 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7fb48775-xdlt2"] Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.604018 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7fb48775-xdlt2"] Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.616780 5008 scope.go:117] "RemoveContainer" containerID="43cfa593601e48407235506732eba9f5732b9c13fe7595b50879ee082d29c276" Mar 18 18:23:49 crc kubenswrapper[5008]: E0318 18:23:49.618735 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43cfa593601e48407235506732eba9f5732b9c13fe7595b50879ee082d29c276\": container with ID starting with 43cfa593601e48407235506732eba9f5732b9c13fe7595b50879ee082d29c276 not found: ID does not exist" 
containerID="43cfa593601e48407235506732eba9f5732b9c13fe7595b50879ee082d29c276" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.618777 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43cfa593601e48407235506732eba9f5732b9c13fe7595b50879ee082d29c276"} err="failed to get container status \"43cfa593601e48407235506732eba9f5732b9c13fe7595b50879ee082d29c276\": rpc error: code = NotFound desc = could not find container \"43cfa593601e48407235506732eba9f5732b9c13fe7595b50879ee082d29c276\": container with ID starting with 43cfa593601e48407235506732eba9f5732b9c13fe7595b50879ee082d29c276 not found: ID does not exist" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.618804 5008 scope.go:117] "RemoveContainer" containerID="80d2a947f294713c07eea3219b9344b3577bc4f8697c02a1a57e93289ca68073" Mar 18 18:23:49 crc kubenswrapper[5008]: E0318 18:23:49.620190 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80d2a947f294713c07eea3219b9344b3577bc4f8697c02a1a57e93289ca68073\": container with ID starting with 80d2a947f294713c07eea3219b9344b3577bc4f8697c02a1a57e93289ca68073 not found: ID does not exist" containerID="80d2a947f294713c07eea3219b9344b3577bc4f8697c02a1a57e93289ca68073" Mar 18 18:23:49 crc kubenswrapper[5008]: I0318 18:23:49.620300 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d2a947f294713c07eea3219b9344b3577bc4f8697c02a1a57e93289ca68073"} err="failed to get container status \"80d2a947f294713c07eea3219b9344b3577bc4f8697c02a1a57e93289ca68073\": rpc error: code = NotFound desc = could not find container \"80d2a947f294713c07eea3219b9344b3577bc4f8697c02a1a57e93289ca68073\": container with ID starting with 80d2a947f294713c07eea3219b9344b3577bc4f8697c02a1a57e93289ca68073 not found: ID does not exist" Mar 18 18:23:50 crc kubenswrapper[5008]: I0318 18:23:50.025697 5008 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 18:23:50 crc kubenswrapper[5008]: I0318 18:23:50.025820 5008 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 18:23:50 crc kubenswrapper[5008]: I0318 18:23:50.212077 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f66df63f-e2b6-4f43-b3c0-9ebb157b0693" path="/var/lib/kubelet/pods/f66df63f-e2b6-4f43-b3c0-9ebb157b0693/volumes" Mar 18 18:23:50 crc kubenswrapper[5008]: I0318 18:23:50.248967 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 18:23:52 crc kubenswrapper[5008]: I0318 18:23:52.558313 5008 generic.go:334] "Generic (PLEG): container finished" podID="315d5f4a-5139-47d4-8aaf-c3088d6eae91" containerID="0467eb604fa1a269513b52ef89a127c20e5e7cac19e0e4d6ff17d11d5afda14e" exitCode=0 Mar 18 18:23:52 crc kubenswrapper[5008]: I0318 18:23:52.558420 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jgqxf" event={"ID":"315d5f4a-5139-47d4-8aaf-c3088d6eae91","Type":"ContainerDied","Data":"0467eb604fa1a269513b52ef89a127c20e5e7cac19e0e4d6ff17d11d5afda14e"} Mar 18 18:23:53 crc kubenswrapper[5008]: I0318 18:23:53.905414 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5d7fb48775-xdlt2" podUID="f66df63f-e2b6-4f43-b3c0-9ebb157b0693" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: i/o timeout" Mar 18 18:23:53 crc kubenswrapper[5008]: I0318 18:23:53.970287 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jgqxf" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.083122 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/315d5f4a-5139-47d4-8aaf-c3088d6eae91-db-sync-config-data\") pod \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\" (UID: \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\") " Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.084130 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ql95\" (UniqueName: \"kubernetes.io/projected/315d5f4a-5139-47d4-8aaf-c3088d6eae91-kube-api-access-9ql95\") pod \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\" (UID: \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\") " Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.084165 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315d5f4a-5139-47d4-8aaf-c3088d6eae91-combined-ca-bundle\") pod \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\" (UID: \"315d5f4a-5139-47d4-8aaf-c3088d6eae91\") " Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.107770 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315d5f4a-5139-47d4-8aaf-c3088d6eae91-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "315d5f4a-5139-47d4-8aaf-c3088d6eae91" (UID: "315d5f4a-5139-47d4-8aaf-c3088d6eae91"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.107816 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315d5f4a-5139-47d4-8aaf-c3088d6eae91-kube-api-access-9ql95" (OuterVolumeSpecName: "kube-api-access-9ql95") pod "315d5f4a-5139-47d4-8aaf-c3088d6eae91" (UID: "315d5f4a-5139-47d4-8aaf-c3088d6eae91"). 
InnerVolumeSpecName "kube-api-access-9ql95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.122689 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315d5f4a-5139-47d4-8aaf-c3088d6eae91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "315d5f4a-5139-47d4-8aaf-c3088d6eae91" (UID: "315d5f4a-5139-47d4-8aaf-c3088d6eae91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.186340 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ql95\" (UniqueName: \"kubernetes.io/projected/315d5f4a-5139-47d4-8aaf-c3088d6eae91-kube-api-access-9ql95\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.186368 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315d5f4a-5139-47d4-8aaf-c3088d6eae91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.186377 5008 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/315d5f4a-5139-47d4-8aaf-c3088d6eae91-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.578190 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jgqxf" event={"ID":"315d5f4a-5139-47d4-8aaf-c3088d6eae91","Type":"ContainerDied","Data":"978cd20e9a75a20d10cef9778acfffef3686b1fb723fb13e63b33c995f3c6272"} Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.578227 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="978cd20e9a75a20d10cef9778acfffef3686b1fb723fb13e63b33c995f3c6272" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.578279 5008 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jgqxf" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.787198 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-ff487fff5-mqmcg"] Mar 18 18:23:54 crc kubenswrapper[5008]: E0318 18:23:54.787747 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315d5f4a-5139-47d4-8aaf-c3088d6eae91" containerName="barbican-db-sync" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.787776 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="315d5f4a-5139-47d4-8aaf-c3088d6eae91" containerName="barbican-db-sync" Mar 18 18:23:54 crc kubenswrapper[5008]: E0318 18:23:54.787804 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66df63f-e2b6-4f43-b3c0-9ebb157b0693" containerName="dnsmasq-dns" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.787815 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66df63f-e2b6-4f43-b3c0-9ebb157b0693" containerName="dnsmasq-dns" Mar 18 18:23:54 crc kubenswrapper[5008]: E0318 18:23:54.787870 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66df63f-e2b6-4f43-b3c0-9ebb157b0693" containerName="init" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.787883 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66df63f-e2b6-4f43-b3c0-9ebb157b0693" containerName="init" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.788158 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="315d5f4a-5139-47d4-8aaf-c3088d6eae91" containerName="barbican-db-sync" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.788245 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f66df63f-e2b6-4f43-b3c0-9ebb157b0693" containerName="dnsmasq-dns" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.790493 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.802785 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4xdkr" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.803045 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.803221 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.811465 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-ff487fff5-mqmcg"] Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.830814 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-66b589877b-qzcdx"] Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.832796 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.835906 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.869312 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-66b589877b-qzcdx"] Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.910544 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-config-data\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.910935 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24a03e07-237e-4583-81b4-8d9aadc76ea3-logs\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.910998 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-combined-ca-bundle\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.911282 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm7dv\" (UniqueName: 
\"kubernetes.io/projected/24a03e07-237e-4583-81b4-8d9aadc76ea3-kube-api-access-zm7dv\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.911414 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-config-data-custom\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.911466 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-combined-ca-bundle\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.911682 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqttq\" (UniqueName: \"kubernetes.io/projected/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-kube-api-access-zqttq\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.911735 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-config-data\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:54 crc 
kubenswrapper[5008]: I0318 18:23:54.911762 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-config-data-custom\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.911796 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-logs\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.925803 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7df4c9958f-kvf5t"] Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.928363 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:54 crc kubenswrapper[5008]: I0318 18:23:54.941576 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7df4c9958f-kvf5t"] Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.013534 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-dns-swift-storage-0\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.013606 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrckn\" (UniqueName: \"kubernetes.io/projected/9dab7b89-40a0-4059-b062-6043e4e240b9-kube-api-access-qrckn\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.013654 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-config\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.013699 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm7dv\" (UniqueName: \"kubernetes.io/projected/24a03e07-237e-4583-81b4-8d9aadc76ea3-kube-api-access-zm7dv\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.013747 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-config-data-custom\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.013782 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-combined-ca-bundle\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.013833 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqttq\" (UniqueName: \"kubernetes.io/projected/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-kube-api-access-zqttq\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.013864 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-config-data\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.013927 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-config-data-custom\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 
18:23:55.013957 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-ovsdbserver-nb\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.013979 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-logs\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.013999 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-config-data\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.014023 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24a03e07-237e-4583-81b4-8d9aadc76ea3-logs\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.014058 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-dns-svc\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 
18:23:55.014074 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-ovsdbserver-sb\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.014098 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-combined-ca-bundle\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.016191 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24a03e07-237e-4583-81b4-8d9aadc76ea3-logs\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.019968 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-config-data-custom\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.020547 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-logs\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.026808 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-combined-ca-bundle\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.028442 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-combined-ca-bundle\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.037048 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-config-data\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.038841 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-config-data\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.054248 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqttq\" (UniqueName: \"kubernetes.io/projected/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-kube-api-access-zqttq\") pod \"barbican-worker-ff487fff5-mqmcg\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.057247 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm7dv\" (UniqueName: \"kubernetes.io/projected/24a03e07-237e-4583-81b4-8d9aadc76ea3-kube-api-access-zm7dv\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.064264 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-config-data-custom\") pod \"barbican-keystone-listener-66b589877b-qzcdx\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.114034 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.115772 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-ovsdbserver-nb\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.116680 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-ovsdbserver-nb\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.116835 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-dns-svc\") pod 
\"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.116853 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-ovsdbserver-sb\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.117402 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-dns-svc\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.117811 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-ovsdbserver-sb\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.118370 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-dns-swift-storage-0\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.118422 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrckn\" (UniqueName: \"kubernetes.io/projected/9dab7b89-40a0-4059-b062-6043e4e240b9-kube-api-access-qrckn\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: 
\"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.118498 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-config\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.119433 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-dns-swift-storage-0\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.119446 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-config\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.159139 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7d95565c88-9t6tp"] Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.160432 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.164306 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.167175 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.178605 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrckn\" (UniqueName: \"kubernetes.io/projected/9dab7b89-40a0-4059-b062-6043e4e240b9-kube-api-access-qrckn\") pod \"dnsmasq-dns-7df4c9958f-kvf5t\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.223492 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-config-data\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.223573 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e82b67-9249-4e21-8da6-3138fddcff0e-logs\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.223623 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-config-data-custom\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.223653 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-l7g4g\" (UniqueName: \"kubernetes.io/projected/56e82b67-9249-4e21-8da6-3138fddcff0e-kube-api-access-l7g4g\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.223699 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-combined-ca-bundle\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.232006 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d95565c88-9t6tp"] Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.257868 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.325082 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-combined-ca-bundle\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.325213 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-config-data\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.325305 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/56e82b67-9249-4e21-8da6-3138fddcff0e-logs\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.325382 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-config-data-custom\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.325428 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7g4g\" (UniqueName: \"kubernetes.io/projected/56e82b67-9249-4e21-8da6-3138fddcff0e-kube-api-access-l7g4g\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.326099 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e82b67-9249-4e21-8da6-3138fddcff0e-logs\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.331620 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-combined-ca-bundle\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.336345 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-config-data-custom\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.341832 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-config-data\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.350360 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7g4g\" (UniqueName: \"kubernetes.io/projected/56e82b67-9249-4e21-8da6-3138fddcff0e-kube-api-access-l7g4g\") pod \"barbican-api-7d95565c88-9t6tp\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") " pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:55 crc kubenswrapper[5008]: I0318 18:23:55.558135 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.771102 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-77598b888d-8wwqt"] Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.773728 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.775427 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.775637 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.795660 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77598b888d-8wwqt"] Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.896712 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-combined-ca-bundle\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.896907 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj4gh\" (UniqueName: \"kubernetes.io/projected/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-kube-api-access-zj4gh\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.897035 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-config-data-custom\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.897078 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-logs\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.897238 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-config-data\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.897425 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-internal-tls-certs\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.897472 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-public-tls-certs\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.999464 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-config-data\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.999619 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-internal-tls-certs\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.999714 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-public-tls-certs\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:58 crc kubenswrapper[5008]: I0318 18:23:58.999861 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-combined-ca-bundle\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:59 crc kubenswrapper[5008]: I0318 18:23:59.000017 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj4gh\" (UniqueName: \"kubernetes.io/projected/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-kube-api-access-zj4gh\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:59 crc kubenswrapper[5008]: I0318 18:23:59.000136 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-config-data-custom\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:59 crc kubenswrapper[5008]: I0318 18:23:59.000200 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-logs\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:59 crc kubenswrapper[5008]: I0318 18:23:59.001043 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-logs\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:59 crc kubenswrapper[5008]: I0318 18:23:59.005803 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-config-data-custom\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:59 crc kubenswrapper[5008]: I0318 18:23:59.007437 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-config-data\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:59 crc kubenswrapper[5008]: I0318 18:23:59.008116 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-internal-tls-certs\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:59 crc kubenswrapper[5008]: I0318 18:23:59.008730 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-public-tls-certs\") pod 
\"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:59 crc kubenswrapper[5008]: I0318 18:23:59.009446 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-combined-ca-bundle\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:59 crc kubenswrapper[5008]: I0318 18:23:59.022791 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj4gh\" (UniqueName: \"kubernetes.io/projected/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-kube-api-access-zj4gh\") pod \"barbican-api-77598b888d-8wwqt\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:23:59 crc kubenswrapper[5008]: I0318 18:23:59.100531 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.179070 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564304-d5g9j"] Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.181600 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564304-d5g9j" Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.186491 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564304-d5g9j"] Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.187545 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.187887 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.188028 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.229389 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgxvq\" (UniqueName: \"kubernetes.io/projected/117e64db-91f2-46f2-872e-2edba77b07d9-kube-api-access-hgxvq\") pod \"auto-csr-approver-29564304-d5g9j\" (UID: \"117e64db-91f2-46f2-872e-2edba77b07d9\") " pod="openshift-infra/auto-csr-approver-29564304-d5g9j" Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.331421 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgxvq\" (UniqueName: \"kubernetes.io/projected/117e64db-91f2-46f2-872e-2edba77b07d9-kube-api-access-hgxvq\") pod \"auto-csr-approver-29564304-d5g9j\" (UID: \"117e64db-91f2-46f2-872e-2edba77b07d9\") " pod="openshift-infra/auto-csr-approver-29564304-d5g9j" Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.335770 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d95565c88-9t6tp"] Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.354366 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgxvq\" (UniqueName: 
\"kubernetes.io/projected/117e64db-91f2-46f2-872e-2edba77b07d9-kube-api-access-hgxvq\") pod \"auto-csr-approver-29564304-d5g9j\" (UID: \"117e64db-91f2-46f2-872e-2edba77b07d9\") " pod="openshift-infra/auto-csr-approver-29564304-d5g9j" Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.442954 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-66b589877b-qzcdx"] Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.524597 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564304-d5g9j" Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.538779 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-ff487fff5-mqmcg"] Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.553504 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7df4c9958f-kvf5t"] Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.563245 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77598b888d-8wwqt"] Mar 18 18:24:00 crc kubenswrapper[5008]: W0318 18:24:00.575647 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd67f3431_0e44_4d3c_8aa9_0f3fb176387d.slice/crio-46bc49800b92d1e06efc1b3f7c4a463b3c5edbbf3daea285cb0ecc49e6cb5b1e WatchSource:0}: Error finding container 46bc49800b92d1e06efc1b3f7c4a463b3c5edbbf3daea285cb0ecc49e6cb5b1e: Status 404 returned error can't find the container with id 46bc49800b92d1e06efc1b3f7c4a463b3c5edbbf3daea285cb0ecc49e6cb5b1e Mar 18 18:24:00 crc kubenswrapper[5008]: W0318 18:24:00.577588 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dab7b89_40a0_4059_b062_6043e4e240b9.slice/crio-009ec75666c6da057e191ef82448f1a849d5c063ad77e8f2bcd0e160c4111dc2 WatchSource:0}: Error finding container 
009ec75666c6da057e191ef82448f1a849d5c063ad77e8f2bcd0e160c4111dc2: Status 404 returned error can't find the container with id 009ec75666c6da057e191ef82448f1a849d5c063ad77e8f2bcd0e160c4111dc2 Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.661656 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77598b888d-8wwqt" event={"ID":"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad","Type":"ContainerStarted","Data":"65429ac5f7e421636d76dc0b135212d8027d2f152d47af343cb9b073d37b1fa1"} Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.668777 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" event={"ID":"24a03e07-237e-4583-81b4-8d9aadc76ea3","Type":"ContainerStarted","Data":"1a6c6e2676c9a628e5d3bbfb8b2283bba22da47044a87021aebcbf9ea7d8b8f2"} Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.670977 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-ff487fff5-mqmcg" event={"ID":"d67f3431-0e44-4d3c-8aa9-0f3fb176387d","Type":"ContainerStarted","Data":"46bc49800b92d1e06efc1b3f7c4a463b3c5edbbf3daea285cb0ecc49e6cb5b1e"} Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.685607 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a81efd6f-d370-4c91-9343-75f6e6d1e85d","Type":"ContainerStarted","Data":"078d226d1804ff7c42c51e4c69d33a168d917fabc1c607596e4d259a6fe1966f"} Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.685856 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="ceilometer-central-agent" containerID="cri-o://a00d2e33d5a611eb348a15725223f5f013cb2fd357ace420a3b26bc02145ca80" gracePeriod=30 Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.686075 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 18:24:00 crc 
kubenswrapper[5008]: I0318 18:24:00.686205 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="proxy-httpd" containerID="cri-o://078d226d1804ff7c42c51e4c69d33a168d917fabc1c607596e4d259a6fe1966f" gracePeriod=30 Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.686387 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="ceilometer-notification-agent" containerID="cri-o://7808d4a8a291f22c3970c6e63bf684a08880d110b4cc300ab3fdba275f68472a" gracePeriod=30 Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.686438 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="sg-core" containerID="cri-o://d6455c2ff064befd66069965986d13aba7a8fe9b2ee23a975978a103770cd133" gracePeriod=30 Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.701466 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" event={"ID":"9dab7b89-40a0-4059-b062-6043e4e240b9","Type":"ContainerStarted","Data":"009ec75666c6da057e191ef82448f1a849d5c063ad77e8f2bcd0e160c4111dc2"} Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.702673 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d95565c88-9t6tp" event={"ID":"56e82b67-9249-4e21-8da6-3138fddcff0e","Type":"ContainerStarted","Data":"53ee03fba85a231bc7f7757dc77a8584100d0d7d249d78bce5743c4bfdb7f845"} Mar 18 18:24:00 crc kubenswrapper[5008]: I0318 18:24:00.715158 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.933967643 podStartE2EDuration="47.715137654s" podCreationTimestamp="2026-03-18 18:23:13 +0000 UTC" firstStartedPulling="2026-03-18 18:23:14.153928645 +0000 
UTC m=+1250.673401714" lastFinishedPulling="2026-03-18 18:23:59.935098646 +0000 UTC m=+1296.454571725" observedRunningTime="2026-03-18 18:24:00.714948769 +0000 UTC m=+1297.234421848" watchObservedRunningTime="2026-03-18 18:24:00.715137654 +0000 UTC m=+1297.234610733" Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.117102 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564304-d5g9j"] Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.720359 5008 generic.go:334] "Generic (PLEG): container finished" podID="9dab7b89-40a0-4059-b062-6043e4e240b9" containerID="845927cd51cbb95f2e8ece82288539f3e9978b858292c645a99783cab6667fdc" exitCode=0 Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.720451 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" event={"ID":"9dab7b89-40a0-4059-b062-6043e4e240b9","Type":"ContainerDied","Data":"845927cd51cbb95f2e8ece82288539f3e9978b858292c645a99783cab6667fdc"} Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.729639 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d95565c88-9t6tp" event={"ID":"56e82b67-9249-4e21-8da6-3138fddcff0e","Type":"ContainerStarted","Data":"4555fe554151fe69d6b14d9faffe5d02d326ed625bba17df8b18c0c2a4b791e9"} Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.729694 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d95565c88-9t6tp" event={"ID":"56e82b67-9249-4e21-8da6-3138fddcff0e","Type":"ContainerStarted","Data":"aae46bfd89d721620ddc010c5a10e41b098e49f0d05df1eb4451aab34792e299"} Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.730345 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.730378 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d95565c88-9t6tp" 
Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.733364 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77598b888d-8wwqt" event={"ID":"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad","Type":"ContainerStarted","Data":"4eadc2692f3a8d44b36203cb5d3923e2c3a539a3f5ba5c413d2a6377e7749375"} Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.733387 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77598b888d-8wwqt" event={"ID":"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad","Type":"ContainerStarted","Data":"e741fbe7654b140e43fc28460cedf85595febc32bb65a05c9b3ffb75588298c2"} Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.733703 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.733746 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.736090 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jbx2h" event={"ID":"bed54cd2-a411-4362-a7b1-7fab16ba8b6b","Type":"ContainerStarted","Data":"a7ce89506274663821070b5c6dad5f8257699a1427ed13a75372ea3757a9a56c"} Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.750927 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564304-d5g9j" event={"ID":"117e64db-91f2-46f2-872e-2edba77b07d9","Type":"ContainerStarted","Data":"490feae64d61603429fe23dd56b207a3635c34548e1b8acda3a441ad424941f5"} Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.759678 5008 generic.go:334] "Generic (PLEG): container finished" podID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerID="078d226d1804ff7c42c51e4c69d33a168d917fabc1c607596e4d259a6fe1966f" exitCode=0 Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.759715 5008 generic.go:334] "Generic (PLEG): container finished" 
podID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerID="d6455c2ff064befd66069965986d13aba7a8fe9b2ee23a975978a103770cd133" exitCode=2 Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.759725 5008 generic.go:334] "Generic (PLEG): container finished" podID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerID="a00d2e33d5a611eb348a15725223f5f013cb2fd357ace420a3b26bc02145ca80" exitCode=0 Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.759772 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a81efd6f-d370-4c91-9343-75f6e6d1e85d","Type":"ContainerDied","Data":"078d226d1804ff7c42c51e4c69d33a168d917fabc1c607596e4d259a6fe1966f"} Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.759854 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a81efd6f-d370-4c91-9343-75f6e6d1e85d","Type":"ContainerDied","Data":"d6455c2ff064befd66069965986d13aba7a8fe9b2ee23a975978a103770cd133"} Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.759875 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a81efd6f-d370-4c91-9343-75f6e6d1e85d","Type":"ContainerDied","Data":"a00d2e33d5a611eb348a15725223f5f013cb2fd357ace420a3b26bc02145ca80"} Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.774790 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-77598b888d-8wwqt" podStartSLOduration=3.774769364 podStartE2EDuration="3.774769364s" podCreationTimestamp="2026-03-18 18:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:01.76014412 +0000 UTC m=+1298.279617199" watchObservedRunningTime="2026-03-18 18:24:01.774769364 +0000 UTC m=+1298.294242443" Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.792717 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-db-sync-jbx2h" podStartSLOduration=3.288488534 podStartE2EDuration="48.792695636s" podCreationTimestamp="2026-03-18 18:23:13 +0000 UTC" firstStartedPulling="2026-03-18 18:23:14.355455611 +0000 UTC m=+1250.874928690" lastFinishedPulling="2026-03-18 18:23:59.859662703 +0000 UTC m=+1296.379135792" observedRunningTime="2026-03-18 18:24:01.783201026 +0000 UTC m=+1298.302674105" watchObservedRunningTime="2026-03-18 18:24:01.792695636 +0000 UTC m=+1298.312168725" Mar 18 18:24:01 crc kubenswrapper[5008]: I0318 18:24:01.804683 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7d95565c88-9t6tp" podStartSLOduration=6.80466373 podStartE2EDuration="6.80466373s" podCreationTimestamp="2026-03-18 18:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:01.804250999 +0000 UTC m=+1298.323724078" watchObservedRunningTime="2026-03-18 18:24:01.80466373 +0000 UTC m=+1298.324136809" Mar 18 18:24:02 crc kubenswrapper[5008]: I0318 18:24:02.780479 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-ff487fff5-mqmcg" event={"ID":"d67f3431-0e44-4d3c-8aa9-0f3fb176387d","Type":"ContainerStarted","Data":"7f6d7b25bd945a15d5845ec99b4e2fba05147a61e2fd4a6f4ddb1df33c32d951"} Mar 18 18:24:02 crc kubenswrapper[5008]: I0318 18:24:02.781329 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-ff487fff5-mqmcg" event={"ID":"d67f3431-0e44-4d3c-8aa9-0f3fb176387d","Type":"ContainerStarted","Data":"f7c289c9bf02046bec3de029584d233f16cf2686d6ecf191d260ee9b9b82fc59"} Mar 18 18:24:02 crc kubenswrapper[5008]: I0318 18:24:02.788081 5008 generic.go:334] "Generic (PLEG): container finished" podID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerID="7808d4a8a291f22c3970c6e63bf684a08880d110b4cc300ab3fdba275f68472a" exitCode=0 Mar 18 18:24:02 crc kubenswrapper[5008]: I0318 
18:24:02.788233 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a81efd6f-d370-4c91-9343-75f6e6d1e85d","Type":"ContainerDied","Data":"7808d4a8a291f22c3970c6e63bf684a08880d110b4cc300ab3fdba275f68472a"} Mar 18 18:24:02 crc kubenswrapper[5008]: I0318 18:24:02.804851 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" event={"ID":"9dab7b89-40a0-4059-b062-6043e4e240b9","Type":"ContainerStarted","Data":"43bdfe87ff924fb55a9249042530f153ad7b50941a95853dffcbfc23040c9059"} Mar 18 18:24:02 crc kubenswrapper[5008]: I0318 18:24:02.806203 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:24:02 crc kubenswrapper[5008]: I0318 18:24:02.806269 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-ff487fff5-mqmcg" podStartSLOduration=7.328300256 podStartE2EDuration="8.806253954s" podCreationTimestamp="2026-03-18 18:23:54 +0000 UTC" firstStartedPulling="2026-03-18 18:24:00.591549275 +0000 UTC m=+1297.111022354" lastFinishedPulling="2026-03-18 18:24:02.069502973 +0000 UTC m=+1298.588976052" observedRunningTime="2026-03-18 18:24:02.802880135 +0000 UTC m=+1299.322353234" watchObservedRunningTime="2026-03-18 18:24:02.806253954 +0000 UTC m=+1299.325727033" Mar 18 18:24:02 crc kubenswrapper[5008]: I0318 18:24:02.840965 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" podStartSLOduration=8.840948506 podStartE2EDuration="8.840948506s" podCreationTimestamp="2026-03-18 18:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:02.830977054 +0000 UTC m=+1299.350450133" watchObservedRunningTime="2026-03-18 18:24:02.840948506 +0000 UTC m=+1299.360421575" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 
18:24:03.447543 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.512851 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a81efd6f-d370-4c91-9343-75f6e6d1e85d-run-httpd\") pod \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.513128 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-config-data\") pod \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.513220 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-combined-ca-bundle\") pod \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.513736 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-scripts\") pod \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.513799 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a81efd6f-d370-4c91-9343-75f6e6d1e85d-log-httpd\") pod \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.513879 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-sg-core-conf-yaml\") pod \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.513976 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpqj8\" (UniqueName: \"kubernetes.io/projected/a81efd6f-d370-4c91-9343-75f6e6d1e85d-kube-api-access-bpqj8\") pod \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\" (UID: \"a81efd6f-d370-4c91-9343-75f6e6d1e85d\") " Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.514062 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81efd6f-d370-4c91-9343-75f6e6d1e85d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a81efd6f-d370-4c91-9343-75f6e6d1e85d" (UID: "a81efd6f-d370-4c91-9343-75f6e6d1e85d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.514502 5008 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a81efd6f-d370-4c91-9343-75f6e6d1e85d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.514502 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81efd6f-d370-4c91-9343-75f6e6d1e85d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a81efd6f-d370-4c91-9343-75f6e6d1e85d" (UID: "a81efd6f-d370-4c91-9343-75f6e6d1e85d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.519732 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81efd6f-d370-4c91-9343-75f6e6d1e85d-kube-api-access-bpqj8" (OuterVolumeSpecName: "kube-api-access-bpqj8") pod "a81efd6f-d370-4c91-9343-75f6e6d1e85d" (UID: "a81efd6f-d370-4c91-9343-75f6e6d1e85d"). InnerVolumeSpecName "kube-api-access-bpqj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.519777 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-scripts" (OuterVolumeSpecName: "scripts") pod "a81efd6f-d370-4c91-9343-75f6e6d1e85d" (UID: "a81efd6f-d370-4c91-9343-75f6e6d1e85d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.552958 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a81efd6f-d370-4c91-9343-75f6e6d1e85d" (UID: "a81efd6f-d370-4c91-9343-75f6e6d1e85d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.593067 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a81efd6f-d370-4c91-9343-75f6e6d1e85d" (UID: "a81efd6f-d370-4c91-9343-75f6e6d1e85d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.616726 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.616768 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.616780 5008 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a81efd6f-d370-4c91-9343-75f6e6d1e85d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.616790 5008 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.616801 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpqj8\" (UniqueName: \"kubernetes.io/projected/a81efd6f-d370-4c91-9343-75f6e6d1e85d-kube-api-access-bpqj8\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.633692 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-config-data" (OuterVolumeSpecName: "config-data") pod "a81efd6f-d370-4c91-9343-75f6e6d1e85d" (UID: "a81efd6f-d370-4c91-9343-75f6e6d1e85d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.719434 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81efd6f-d370-4c91-9343-75f6e6d1e85d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.815315 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a81efd6f-d370-4c91-9343-75f6e6d1e85d","Type":"ContainerDied","Data":"f520d91ca33efbade783543f5a1157f1b6b9c16207e72d2804145ad3fed49481"} Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.815373 5008 scope.go:117] "RemoveContainer" containerID="078d226d1804ff7c42c51e4c69d33a168d917fabc1c607596e4d259a6fe1966f" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.815509 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.828450 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" event={"ID":"24a03e07-237e-4583-81b4-8d9aadc76ea3","Type":"ContainerStarted","Data":"dbb3bbf45e4cb157c1f61921058ede9501e02fa9f0a5deea7ea5db2b1a9c31e5"} Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.828497 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" event={"ID":"24a03e07-237e-4583-81b4-8d9aadc76ea3","Type":"ContainerStarted","Data":"b33ab61dcd597370160ead5da0be76a79fadc5911c77ebcd182c96bc9147d686"} Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.840256 5008 generic.go:334] "Generic (PLEG): container finished" podID="117e64db-91f2-46f2-872e-2edba77b07d9" containerID="a90c2927967c80cd7fe00d77b140f18b6966386ce76f1ff72c6c5b909aca51ee" exitCode=0 Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.840619 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29564304-d5g9j" event={"ID":"117e64db-91f2-46f2-872e-2edba77b07d9","Type":"ContainerDied","Data":"a90c2927967c80cd7fe00d77b140f18b6966386ce76f1ff72c6c5b909aca51ee"} Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.851056 5008 scope.go:117] "RemoveContainer" containerID="d6455c2ff064befd66069965986d13aba7a8fe9b2ee23a975978a103770cd133" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.860722 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" podStartSLOduration=7.254932566 podStartE2EDuration="9.860694747s" podCreationTimestamp="2026-03-18 18:23:54 +0000 UTC" firstStartedPulling="2026-03-18 18:24:00.474470107 +0000 UTC m=+1296.993943186" lastFinishedPulling="2026-03-18 18:24:03.080232288 +0000 UTC m=+1299.599705367" observedRunningTime="2026-03-18 18:24:03.85016552 +0000 UTC m=+1300.369638609" watchObservedRunningTime="2026-03-18 18:24:03.860694747 +0000 UTC m=+1300.380167866" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.876403 5008 scope.go:117] "RemoveContainer" containerID="7808d4a8a291f22c3970c6e63bf684a08880d110b4cc300ab3fdba275f68472a" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.896935 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.912924 5008 scope.go:117] "RemoveContainer" containerID="a00d2e33d5a611eb348a15725223f5f013cb2fd357ace420a3b26bc02145ca80" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.917152 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.930623 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:03 crc kubenswrapper[5008]: E0318 18:24:03.931097 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" 
containerName="ceilometer-central-agent" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.931121 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="ceilometer-central-agent" Mar 18 18:24:03 crc kubenswrapper[5008]: E0318 18:24:03.931150 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="sg-core" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.931159 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="sg-core" Mar 18 18:24:03 crc kubenswrapper[5008]: E0318 18:24:03.931188 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="ceilometer-notification-agent" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.931196 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="ceilometer-notification-agent" Mar 18 18:24:03 crc kubenswrapper[5008]: E0318 18:24:03.931208 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="proxy-httpd" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.931215 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="proxy-httpd" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.931447 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="ceilometer-central-agent" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.931461 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="sg-core" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.931475 5008 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="proxy-httpd" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.931488 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" containerName="ceilometer-notification-agent" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.933456 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.936067 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.945605 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 18:24:03 crc kubenswrapper[5008]: I0318 18:24:03.945858 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.629535 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81efd6f-d370-4c91-9343-75f6e6d1e85d" path="/var/lib/kubelet/pods/a81efd6f-d370-4c91-9343-75f6e6d1e85d/volumes" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.701698 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.701977 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6954c220-010e-4046-8592-192a131fe488-log-httpd\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.702146 
5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b9kn\" (UniqueName: \"kubernetes.io/projected/6954c220-010e-4046-8592-192a131fe488-kube-api-access-4b9kn\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.702294 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-config-data\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.702610 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.702900 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6954c220-010e-4046-8592-192a131fe488-run-httpd\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.703042 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-scripts\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.805360 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b9kn\" (UniqueName: 
\"kubernetes.io/projected/6954c220-010e-4046-8592-192a131fe488-kube-api-access-4b9kn\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.805440 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-config-data\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.805512 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.805653 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6954c220-010e-4046-8592-192a131fe488-run-httpd\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.805693 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-scripts\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.805788 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 
18:24:04.805870 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6954c220-010e-4046-8592-192a131fe488-log-httpd\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.809673 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6954c220-010e-4046-8592-192a131fe488-log-httpd\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.810445 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6954c220-010e-4046-8592-192a131fe488-run-httpd\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.813541 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-scripts\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.837312 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.841470 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " 
pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.841777 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b9kn\" (UniqueName: \"kubernetes.io/projected/6954c220-010e-4046-8592-192a131fe488-kube-api-access-4b9kn\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.843502 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-config-data\") pod \"ceilometer-0\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " pod="openstack/ceilometer-0" Mar 18 18:24:04 crc kubenswrapper[5008]: I0318 18:24:04.980985 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:05 crc kubenswrapper[5008]: I0318 18:24:05.311799 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564304-d5g9j" Mar 18 18:24:05 crc kubenswrapper[5008]: I0318 18:24:05.418438 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgxvq\" (UniqueName: \"kubernetes.io/projected/117e64db-91f2-46f2-872e-2edba77b07d9-kube-api-access-hgxvq\") pod \"117e64db-91f2-46f2-872e-2edba77b07d9\" (UID: \"117e64db-91f2-46f2-872e-2edba77b07d9\") " Mar 18 18:24:05 crc kubenswrapper[5008]: I0318 18:24:05.422146 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/117e64db-91f2-46f2-872e-2edba77b07d9-kube-api-access-hgxvq" (OuterVolumeSpecName: "kube-api-access-hgxvq") pod "117e64db-91f2-46f2-872e-2edba77b07d9" (UID: "117e64db-91f2-46f2-872e-2edba77b07d9"). InnerVolumeSpecName "kube-api-access-hgxvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:05 crc kubenswrapper[5008]: I0318 18:24:05.521410 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgxvq\" (UniqueName: \"kubernetes.io/projected/117e64db-91f2-46f2-872e-2edba77b07d9-kube-api-access-hgxvq\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:05 crc kubenswrapper[5008]: I0318 18:24:05.558023 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:05 crc kubenswrapper[5008]: W0318 18:24:05.558996 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6954c220_010e_4046_8592_192a131fe488.slice/crio-77f7af1430a5f5b658ae5594806c415cf068d71b17f4bb76df24cc3cc79eb776 WatchSource:0}: Error finding container 77f7af1430a5f5b658ae5594806c415cf068d71b17f4bb76df24cc3cc79eb776: Status 404 returned error can't find the container with id 77f7af1430a5f5b658ae5594806c415cf068d71b17f4bb76df24cc3cc79eb776 Mar 18 18:24:05 crc kubenswrapper[5008]: I0318 18:24:05.859992 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6954c220-010e-4046-8592-192a131fe488","Type":"ContainerStarted","Data":"77f7af1430a5f5b658ae5594806c415cf068d71b17f4bb76df24cc3cc79eb776"} Mar 18 18:24:05 crc kubenswrapper[5008]: I0318 18:24:05.861425 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564304-d5g9j" event={"ID":"117e64db-91f2-46f2-872e-2edba77b07d9","Type":"ContainerDied","Data":"490feae64d61603429fe23dd56b207a3635c34548e1b8acda3a441ad424941f5"} Mar 18 18:24:05 crc kubenswrapper[5008]: I0318 18:24:05.861453 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="490feae64d61603429fe23dd56b207a3635c34548e1b8acda3a441ad424941f5" Mar 18 18:24:05 crc kubenswrapper[5008]: I0318 18:24:05.861639 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564304-d5g9j" Mar 18 18:24:06 crc kubenswrapper[5008]: I0318 18:24:06.376149 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564298-gcwk6"] Mar 18 18:24:06 crc kubenswrapper[5008]: I0318 18:24:06.391343 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564298-gcwk6"] Mar 18 18:24:06 crc kubenswrapper[5008]: I0318 18:24:06.871230 5008 generic.go:334] "Generic (PLEG): container finished" podID="bed54cd2-a411-4362-a7b1-7fab16ba8b6b" containerID="a7ce89506274663821070b5c6dad5f8257699a1427ed13a75372ea3757a9a56c" exitCode=0 Mar 18 18:24:06 crc kubenswrapper[5008]: I0318 18:24:06.871270 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jbx2h" event={"ID":"bed54cd2-a411-4362-a7b1-7fab16ba8b6b","Type":"ContainerDied","Data":"a7ce89506274663821070b5c6dad5f8257699a1427ed13a75372ea3757a9a56c"} Mar 18 18:24:07 crc kubenswrapper[5008]: I0318 18:24:07.030065 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:24:07 crc kubenswrapper[5008]: I0318 18:24:07.096805 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d95565c88-9t6tp" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.208970 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14e9e042-217a-40fe-a7c7-1d63a37cf3de" path="/var/lib/kubelet/pods/14e9e042-217a-40fe-a7c7-1d63a37cf3de/volumes" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.326308 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.508671 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-combined-ca-bundle\") pod \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.509038 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-db-sync-config-data\") pod \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.509166 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl2gj\" (UniqueName: \"kubernetes.io/projected/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-kube-api-access-zl2gj\") pod \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.509280 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-config-data\") pod \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.509382 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-etc-machine-id\") pod \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.509481 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-scripts\") pod \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\" (UID: \"bed54cd2-a411-4362-a7b1-7fab16ba8b6b\") " Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.509495 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bed54cd2-a411-4362-a7b1-7fab16ba8b6b" (UID: "bed54cd2-a411-4362-a7b1-7fab16ba8b6b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.510035 5008 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.515902 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bed54cd2-a411-4362-a7b1-7fab16ba8b6b" (UID: "bed54cd2-a411-4362-a7b1-7fab16ba8b6b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.516231 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-kube-api-access-zl2gj" (OuterVolumeSpecName: "kube-api-access-zl2gj") pod "bed54cd2-a411-4362-a7b1-7fab16ba8b6b" (UID: "bed54cd2-a411-4362-a7b1-7fab16ba8b6b"). InnerVolumeSpecName "kube-api-access-zl2gj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.519765 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-scripts" (OuterVolumeSpecName: "scripts") pod "bed54cd2-a411-4362-a7b1-7fab16ba8b6b" (UID: "bed54cd2-a411-4362-a7b1-7fab16ba8b6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.519851 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.606939 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-config-data" (OuterVolumeSpecName: "config-data") pod "bed54cd2-a411-4362-a7b1-7fab16ba8b6b" (UID: "bed54cd2-a411-4362-a7b1-7fab16ba8b6b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.612785 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.612815 5008 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.612827 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl2gj\" (UniqueName: \"kubernetes.io/projected/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-kube-api-access-zl2gj\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.612840 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.638163 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bed54cd2-a411-4362-a7b1-7fab16ba8b6b" (UID: "bed54cd2-a411-4362-a7b1-7fab16ba8b6b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.714521 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed54cd2-a411-4362-a7b1-7fab16ba8b6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.777268 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c7c77574f-9xjzj"] Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.777515 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-c7c77574f-9xjzj" podUID="e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" containerName="neutron-api" containerID="cri-o://d8321e3a7df103240068788e10312c916b5bd1e5358b1b6141e98b2af6e2420b" gracePeriod=30 Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.778956 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-c7c77574f-9xjzj" podUID="e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" containerName="neutron-httpd" containerID="cri-o://9c613ccb6c89ec32c5078a70b44fae2c00bb3b09aa99e018dbbd081bec4cccf3" gracePeriod=30 Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.799021 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-c7c77574f-9xjzj" podUID="e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9696/\": EOF" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.839336 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-85f9f77dc-mg4p7"] Mar 18 18:24:08 crc kubenswrapper[5008]: E0318 18:24:08.839807 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117e64db-91f2-46f2-872e-2edba77b07d9" containerName="oc" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.839831 5008 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="117e64db-91f2-46f2-872e-2edba77b07d9" containerName="oc" Mar 18 18:24:08 crc kubenswrapper[5008]: E0318 18:24:08.839866 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed54cd2-a411-4362-a7b1-7fab16ba8b6b" containerName="cinder-db-sync" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.839874 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed54cd2-a411-4362-a7b1-7fab16ba8b6b" containerName="cinder-db-sync" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.840087 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="117e64db-91f2-46f2-872e-2edba77b07d9" containerName="oc" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.840121 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="bed54cd2-a411-4362-a7b1-7fab16ba8b6b" containerName="cinder-db-sync" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.841218 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.895267 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jbx2h" event={"ID":"bed54cd2-a411-4362-a7b1-7fab16ba8b6b","Type":"ContainerDied","Data":"a82bf8d71775861ab06ef06297311574a5993f9c7bfe1583586d2e6b7d097e89"} Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.895307 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a82bf8d71775861ab06ef06297311574a5993f9c7bfe1583586d2e6b7d097e89" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.895397 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jbx2h" Mar 18 18:24:08 crc kubenswrapper[5008]: I0318 18:24:08.949070 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85f9f77dc-mg4p7"] Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.019455 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-ovndb-tls-certs\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.019516 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-httpd-config\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.019684 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9lfg\" (UniqueName: \"kubernetes.io/projected/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-kube-api-access-t9lfg\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.019811 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-combined-ca-bundle\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.019884 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-public-tls-certs\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.019956 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-internal-tls-certs\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.020024 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-config\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.121065 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-internal-tls-certs\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.121132 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-config\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.121154 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-ovndb-tls-certs\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.121187 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-httpd-config\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.121231 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9lfg\" (UniqueName: \"kubernetes.io/projected/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-kube-api-access-t9lfg\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.121271 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-combined-ca-bundle\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.121306 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-public-tls-certs\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.129163 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-ovndb-tls-certs\") pod 
\"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.129375 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-httpd-config\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.129412 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-combined-ca-bundle\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.129905 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-public-tls-certs\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.134254 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-config\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.134392 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-internal-tls-certs\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: 
I0318 18:24:09.189923 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.191334 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.200897 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.200950 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.201144 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.201263 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-k8qml" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.208617 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.213274 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9lfg\" (UniqueName: \"kubernetes.io/projected/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-kube-api-access-t9lfg\") pod \"neutron-85f9f77dc-mg4p7\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") " pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.241467 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.241530 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s75qn\" (UniqueName: \"kubernetes.io/projected/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-kube-api-access-s75qn\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.241618 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.241657 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.241706 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.241745 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.300443 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7df4c9958f-kvf5t"] Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.300978 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" podUID="9dab7b89-40a0-4059-b062-6043e4e240b9" containerName="dnsmasq-dns" containerID="cri-o://43bdfe87ff924fb55a9249042530f153ad7b50941a95853dffcbfc23040c9059" gracePeriod=10 Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.305703 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.343609 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.343671 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.343718 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.343749 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s75qn\" (UniqueName: \"kubernetes.io/projected/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-kube-api-access-s75qn\") pod \"cinder-scheduler-0\" (UID: 
\"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.343795 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.343820 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.344996 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.347733 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.368732 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.371734 5008 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.374002 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.384665 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s75qn\" (UniqueName: \"kubernetes.io/projected/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-kube-api-access-s75qn\") pod \"cinder-scheduler-0\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.396623 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8995fbb57-snzkk"] Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.398394 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.410295 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8995fbb57-snzkk"] Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.452103 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-config\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.452180 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8cx6\" (UniqueName: \"kubernetes.io/projected/fe68e762-c3c6-46d4-a897-c254828dd808-kube-api-access-m8cx6\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.452297 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-ovsdbserver-sb\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.452380 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-ovsdbserver-nb\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.452501 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-dns-swift-storage-0\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.452620 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-dns-svc\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.466118 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.516016 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.517367 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.522066 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.529577 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.533265 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.555285 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-dns-swift-storage-0\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.555346 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9646d282-d3bf-418e-ba39-fc1766e9de27-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.555374 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-dns-svc\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.555420 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-config\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.555453 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8cx6\" (UniqueName: \"kubernetes.io/projected/fe68e762-c3c6-46d4-a897-c254828dd808-kube-api-access-m8cx6\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc 
kubenswrapper[5008]: I0318 18:24:09.555479 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.555512 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-config-data\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.555534 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9646d282-d3bf-418e-ba39-fc1766e9de27-logs\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.555675 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-ovsdbserver-sb\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.555695 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-config-data-custom\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.555736 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-5xwbk\" (UniqueName: \"kubernetes.io/projected/9646d282-d3bf-418e-ba39-fc1766e9de27-kube-api-access-5xwbk\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.555757 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-ovsdbserver-nb\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.555787 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-scripts\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.557747 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-dns-swift-storage-0\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.557930 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-dns-svc\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.558358 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-ovsdbserver-sb\") pod 
\"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.561043 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-config\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.561065 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-ovsdbserver-nb\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.581975 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8cx6\" (UniqueName: \"kubernetes.io/projected/fe68e762-c3c6-46d4-a897-c254828dd808-kube-api-access-m8cx6\") pod \"dnsmasq-dns-8995fbb57-snzkk\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.637639 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.658286 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9646d282-d3bf-418e-ba39-fc1766e9de27-logs\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.658649 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-config-data-custom\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.658691 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xwbk\" (UniqueName: \"kubernetes.io/projected/9646d282-d3bf-418e-ba39-fc1766e9de27-kube-api-access-5xwbk\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.658729 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-scripts\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.658785 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9646d282-d3bf-418e-ba39-fc1766e9de27-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.658840 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.658861 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-config-data\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.662758 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9646d282-d3bf-418e-ba39-fc1766e9de27-logs\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.663519 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9646d282-d3bf-418e-ba39-fc1766e9de27-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.674727 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-config-data-custom\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.681804 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-scripts\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.682724 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-config-data\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.683202 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.696090 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xwbk\" (UniqueName: \"kubernetes.io/projected/9646d282-d3bf-418e-ba39-fc1766e9de27-kube-api-access-5xwbk\") pod \"cinder-api-0\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " pod="openstack/cinder-api-0" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.928311 5008 generic.go:334] "Generic (PLEG): container finished" podID="9dab7b89-40a0-4059-b062-6043e4e240b9" containerID="43bdfe87ff924fb55a9249042530f153ad7b50941a95853dffcbfc23040c9059" exitCode=0 Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.928350 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" event={"ID":"9dab7b89-40a0-4059-b062-6043e4e240b9","Type":"ContainerDied","Data":"43bdfe87ff924fb55a9249042530f153ad7b50941a95853dffcbfc23040c9059"} Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.942948 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:24:09 crc kubenswrapper[5008]: I0318 18:24:09.952305 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.075157 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-config\") pod \"9dab7b89-40a0-4059-b062-6043e4e240b9\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.075537 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrckn\" (UniqueName: \"kubernetes.io/projected/9dab7b89-40a0-4059-b062-6043e4e240b9-kube-api-access-qrckn\") pod \"9dab7b89-40a0-4059-b062-6043e4e240b9\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.075570 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-dns-svc\") pod \"9dab7b89-40a0-4059-b062-6043e4e240b9\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.075624 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-dns-swift-storage-0\") pod \"9dab7b89-40a0-4059-b062-6043e4e240b9\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.075737 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-ovsdbserver-sb\") pod \"9dab7b89-40a0-4059-b062-6043e4e240b9\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.075822 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-ovsdbserver-nb\") pod \"9dab7b89-40a0-4059-b062-6043e4e240b9\" (UID: \"9dab7b89-40a0-4059-b062-6043e4e240b9\") " Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.085758 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dab7b89-40a0-4059-b062-6043e4e240b9-kube-api-access-qrckn" (OuterVolumeSpecName: "kube-api-access-qrckn") pod "9dab7b89-40a0-4059-b062-6043e4e240b9" (UID: "9dab7b89-40a0-4059-b062-6043e4e240b9"). InnerVolumeSpecName "kube-api-access-qrckn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.152282 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9dab7b89-40a0-4059-b062-6043e4e240b9" (UID: "9dab7b89-40a0-4059-b062-6043e4e240b9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.169126 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9dab7b89-40a0-4059-b062-6043e4e240b9" (UID: "9dab7b89-40a0-4059-b062-6043e4e240b9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.179563 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrckn\" (UniqueName: \"kubernetes.io/projected/9dab7b89-40a0-4059-b062-6043e4e240b9-kube-api-access-qrckn\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.179588 5008 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.179599 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.188169 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8995fbb57-snzkk"] Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.226667 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9dab7b89-40a0-4059-b062-6043e4e240b9" (UID: "9dab7b89-40a0-4059-b062-6043e4e240b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.239374 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9dab7b89-40a0-4059-b062-6043e4e240b9" (UID: "9dab7b89-40a0-4059-b062-6043e4e240b9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.250983 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-config" (OuterVolumeSpecName: "config") pod "9dab7b89-40a0-4059-b062-6043e4e240b9" (UID: "9dab7b89-40a0-4059-b062-6043e4e240b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.280700 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.280722 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.280730 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9dab7b89-40a0-4059-b062-6043e4e240b9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.291702 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.387388 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85f9f77dc-mg4p7"] Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.455689 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.950908 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6954c220-010e-4046-8592-192a131fe488","Type":"ContainerStarted","Data":"0ccf73214d6197a655f3da448eb3396b2393efd78a8580f0b1a9df56d850bb15"} Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.955776 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9cb5e1e-6917-4864-9b33-ed7de36e11ec","Type":"ContainerStarted","Data":"fc97f8a2bfe1c9976c4bc8afae44bc7eeb6c295dd2ff5841e43db3d0f13e0746"} Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.957762 5008 generic.go:334] "Generic (PLEG): container finished" podID="fe68e762-c3c6-46d4-a897-c254828dd808" containerID="e42d030eac9288a107b6bca15738bed5ae0d85520b780e690d3fa617e74d46cc" exitCode=0 Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.957835 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8995fbb57-snzkk" event={"ID":"fe68e762-c3c6-46d4-a897-c254828dd808","Type":"ContainerDied","Data":"e42d030eac9288a107b6bca15738bed5ae0d85520b780e690d3fa617e74d46cc"} Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.957865 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8995fbb57-snzkk" event={"ID":"fe68e762-c3c6-46d4-a897-c254828dd808","Type":"ContainerStarted","Data":"f31a77ff8c99f25b89a5ee6bcdc603c0567df4c8db1528ba553951ba6fd64a9b"} Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.965066 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" event={"ID":"9dab7b89-40a0-4059-b062-6043e4e240b9","Type":"ContainerDied","Data":"009ec75666c6da057e191ef82448f1a849d5c063ad77e8f2bcd0e160c4111dc2"} Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.965114 5008 scope.go:117] "RemoveContainer" containerID="43bdfe87ff924fb55a9249042530f153ad7b50941a95853dffcbfc23040c9059" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.965297 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7df4c9958f-kvf5t" Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.970903 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9646d282-d3bf-418e-ba39-fc1766e9de27","Type":"ContainerStarted","Data":"47be91260d741e9ea5f0205bea8e7914a4d5882382f4b29ed92fa04d507edfba"} Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.975694 5008 generic.go:334] "Generic (PLEG): container finished" podID="e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" containerID="9c613ccb6c89ec32c5078a70b44fae2c00bb3b09aa99e018dbbd081bec4cccf3" exitCode=0 Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.975762 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7c77574f-9xjzj" event={"ID":"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d","Type":"ContainerDied","Data":"9c613ccb6c89ec32c5078a70b44fae2c00bb3b09aa99e018dbbd081bec4cccf3"} Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.984069 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f9f77dc-mg4p7" event={"ID":"0c9299b1-8e15-4e9c-bada-ce88af9c1c28","Type":"ContainerStarted","Data":"37ee661f7953b8d9a32a6d1f71d0668b96eefbaeef42d315f3b9741a68892653"} Mar 18 18:24:10 crc kubenswrapper[5008]: I0318 18:24:10.984139 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f9f77dc-mg4p7" event={"ID":"0c9299b1-8e15-4e9c-bada-ce88af9c1c28","Type":"ContainerStarted","Data":"9f24721f654fa721da03e8ccd3a5475770347df8a8b67e723417237b01c2f097"} Mar 18 18:24:11 crc kubenswrapper[5008]: I0318 18:24:10.999823 5008 scope.go:117] "RemoveContainer" containerID="845927cd51cbb95f2e8ece82288539f3e9978b858292c645a99783cab6667fdc" Mar 18 18:24:11 crc kubenswrapper[5008]: I0318 18:24:11.035650 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7df4c9958f-kvf5t"] Mar 18 18:24:11 crc kubenswrapper[5008]: I0318 18:24:11.046353 5008 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-7df4c9958f-kvf5t"] Mar 18 18:24:11 crc kubenswrapper[5008]: I0318 18:24:11.096936 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-c7c77574f-9xjzj" podUID="e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9696/\": dial tcp 10.217.0.153:9696: connect: connection refused" Mar 18 18:24:11 crc kubenswrapper[5008]: I0318 18:24:11.343862 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:24:11 crc kubenswrapper[5008]: I0318 18:24:11.955175 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77598b888d-8wwqt" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.005538 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6954c220-010e-4046-8592-192a131fe488","Type":"ContainerStarted","Data":"2c04800848f603732220b19ce0fce3517825fd4a6b4652f7d4e4e4b262be4ed7"} Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.025346 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7d95565c88-9t6tp"] Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.025609 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7d95565c88-9t6tp" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerName="barbican-api-log" containerID="cri-o://aae46bfd89d721620ddc010c5a10e41b098e49f0d05df1eb4451aab34792e299" gracePeriod=30 Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.025718 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7d95565c88-9t6tp" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerName="barbican-api" containerID="cri-o://4555fe554151fe69d6b14d9faffe5d02d326ed625bba17df8b18c0c2a4b791e9" gracePeriod=30 Mar 18 18:24:12 crc 
kubenswrapper[5008]: I0318 18:24:12.031857 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8995fbb57-snzkk" event={"ID":"fe68e762-c3c6-46d4-a897-c254828dd808","Type":"ContainerStarted","Data":"a8c0850c31f1fbc0a4c6c047087e854ab6daf0542dc068d89a43baf1fc032648"} Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.032789 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.045749 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7d95565c88-9t6tp" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.159:9311/healthcheck\": EOF" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.045863 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7d95565c88-9t6tp" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.159:9311/healthcheck\": EOF" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.046020 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7d95565c88-9t6tp" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.159:9311/healthcheck\": EOF" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.050506 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9646d282-d3bf-418e-ba39-fc1766e9de27","Type":"ContainerStarted","Data":"268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67"} Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.053758 5008 generic.go:334] "Generic (PLEG): container finished" podID="e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" containerID="d8321e3a7df103240068788e10312c916b5bd1e5358b1b6141e98b2af6e2420b" exitCode=0 Mar 18 
18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.053829 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7c77574f-9xjzj" event={"ID":"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d","Type":"ContainerDied","Data":"d8321e3a7df103240068788e10312c916b5bd1e5358b1b6141e98b2af6e2420b"} Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.055645 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f9f77dc-mg4p7" event={"ID":"0c9299b1-8e15-4e9c-bada-ce88af9c1c28","Type":"ContainerStarted","Data":"c22d63804d2fa8eaa1661c6774af139f50fcab700ab10feba4318bb64f3859aa"} Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.056366 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.075833 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8995fbb57-snzkk" podStartSLOduration=3.075777967 podStartE2EDuration="3.075777967s" podCreationTimestamp="2026-03-18 18:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:12.068780113 +0000 UTC m=+1308.588253192" watchObservedRunningTime="2026-03-18 18:24:12.075777967 +0000 UTC m=+1308.595251056" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.114260 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-85f9f77dc-mg4p7" podStartSLOduration=4.114234618 podStartE2EDuration="4.114234618s" podCreationTimestamp="2026-03-18 18:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:12.092011114 +0000 UTC m=+1308.611484223" watchObservedRunningTime="2026-03-18 18:24:12.114234618 +0000 UTC m=+1308.633707697" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.216091 5008 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dab7b89-40a0-4059-b062-6043e4e240b9" path="/var/lib/kubelet/pods/9dab7b89-40a0-4059-b062-6043e4e240b9/volumes" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.470411 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.586148 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-internal-tls-certs\") pod \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.586250 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-config\") pod \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.586279 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-public-tls-certs\") pod \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.586315 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-combined-ca-bundle\") pod \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.586395 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-ovndb-tls-certs\") pod \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.586423 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-httpd-config\") pod \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.586448 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kkml\" (UniqueName: \"kubernetes.io/projected/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-kube-api-access-4kkml\") pod \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\" (UID: \"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d\") " Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.603735 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" (UID: "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.604260 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-kube-api-access-4kkml" (OuterVolumeSpecName: "kube-api-access-4kkml") pod "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" (UID: "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d"). InnerVolumeSpecName "kube-api-access-4kkml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.650306 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.691170 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.691221 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kkml\" (UniqueName: \"kubernetes.io/projected/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-kube-api-access-4kkml\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.731705 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" (UID: "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.738853 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-config" (OuterVolumeSpecName: "config") pod "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" (UID: "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.752772 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" (UID: "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.763869 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" (UID: "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.794270 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.794299 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.794309 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.794337 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.804913 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" (UID: "e5a60d5e-70ec-4e1b-b025-2b6af6b1338d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:12 crc kubenswrapper[5008]: I0318 18:24:12.896157 5008 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.065706 5008 generic.go:334] "Generic (PLEG): container finished" podID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerID="aae46bfd89d721620ddc010c5a10e41b098e49f0d05df1eb4451aab34792e299" exitCode=143 Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.065776 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d95565c88-9t6tp" event={"ID":"56e82b67-9249-4e21-8da6-3138fddcff0e","Type":"ContainerDied","Data":"aae46bfd89d721620ddc010c5a10e41b098e49f0d05df1eb4451aab34792e299"} Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.068220 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9646d282-d3bf-418e-ba39-fc1766e9de27","Type":"ContainerStarted","Data":"872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932"} Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.073164 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9646d282-d3bf-418e-ba39-fc1766e9de27" containerName="cinder-api-log" containerID="cri-o://268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67" gracePeriod=30 Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.073826 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9646d282-d3bf-418e-ba39-fc1766e9de27" containerName="cinder-api" containerID="cri-o://872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932" gracePeriod=30 Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.121688 5008 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.121668396 podStartE2EDuration="4.121668396s" podCreationTimestamp="2026-03-18 18:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:13.112252748 +0000 UTC m=+1309.631725827" watchObservedRunningTime="2026-03-18 18:24:13.121668396 +0000 UTC m=+1309.641141475" Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.122819 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7c77574f-9xjzj" event={"ID":"e5a60d5e-70ec-4e1b-b025-2b6af6b1338d","Type":"ContainerDied","Data":"b1a0743d568bf4af006d3eba3792f5fbd51a4e0d597f352cc0fb534f9ad43132"} Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.122879 5008 scope.go:117] "RemoveContainer" containerID="9c613ccb6c89ec32c5078a70b44fae2c00bb3b09aa99e018dbbd081bec4cccf3" Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.123048 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c7c77574f-9xjzj" Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.156315 5008 scope.go:117] "RemoveContainer" containerID="d8321e3a7df103240068788e10312c916b5bd1e5358b1b6141e98b2af6e2420b" Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.170727 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6954c220-010e-4046-8592-192a131fe488","Type":"ContainerStarted","Data":"37b3097e7eb31a482b5d14557b6615ab2d05394d59ca4e08adba6e10955b1241"} Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.201644 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c7c77574f-9xjzj"] Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.210057 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c7c77574f-9xjzj"] Mar 18 18:24:13 crc kubenswrapper[5008]: E0318 18:24:13.395831 5008 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9646d282_d3bf_418e_ba39_fc1766e9de27.slice/crio-conmon-268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9646d282_d3bf_418e_ba39_fc1766e9de27.slice/crio-268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5a60d5e_70ec_4e1b_b025_2b6af6b1338d.slice/crio-b1a0743d568bf4af006d3eba3792f5fbd51a4e0d597f352cc0fb534f9ad43132\": RecentStats: unable to find data in memory cache]" Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.475004 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 
18:24:13.807172 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:24:13 crc kubenswrapper[5008]: I0318 18:24:13.971179 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.045738 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xwbk\" (UniqueName: \"kubernetes.io/projected/9646d282-d3bf-418e-ba39-fc1766e9de27-kube-api-access-5xwbk\") pod \"9646d282-d3bf-418e-ba39-fc1766e9de27\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.045817 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9646d282-d3bf-418e-ba39-fc1766e9de27-etc-machine-id\") pod \"9646d282-d3bf-418e-ba39-fc1766e9de27\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.045865 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-combined-ca-bundle\") pod \"9646d282-d3bf-418e-ba39-fc1766e9de27\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.045899 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-scripts\") pod \"9646d282-d3bf-418e-ba39-fc1766e9de27\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.045975 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-config-data-custom\") pod 
\"9646d282-d3bf-418e-ba39-fc1766e9de27\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.046001 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-config-data\") pod \"9646d282-d3bf-418e-ba39-fc1766e9de27\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.046078 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9646d282-d3bf-418e-ba39-fc1766e9de27-logs\") pod \"9646d282-d3bf-418e-ba39-fc1766e9de27\" (UID: \"9646d282-d3bf-418e-ba39-fc1766e9de27\") " Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.047000 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9646d282-d3bf-418e-ba39-fc1766e9de27-logs" (OuterVolumeSpecName: "logs") pod "9646d282-d3bf-418e-ba39-fc1766e9de27" (UID: "9646d282-d3bf-418e-ba39-fc1766e9de27"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.048844 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9646d282-d3bf-418e-ba39-fc1766e9de27-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9646d282-d3bf-418e-ba39-fc1766e9de27" (UID: "9646d282-d3bf-418e-ba39-fc1766e9de27"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.059871 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-scripts" (OuterVolumeSpecName: "scripts") pod "9646d282-d3bf-418e-ba39-fc1766e9de27" (UID: "9646d282-d3bf-418e-ba39-fc1766e9de27"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.064924 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5b86568468-vhc29"] Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.065727 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9646d282-d3bf-418e-ba39-fc1766e9de27" (UID: "9646d282-d3bf-418e-ba39-fc1766e9de27"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:14 crc kubenswrapper[5008]: E0318 18:24:14.065908 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dab7b89-40a0-4059-b062-6043e4e240b9" containerName="init" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.065928 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dab7b89-40a0-4059-b062-6043e4e240b9" containerName="init" Mar 18 18:24:14 crc kubenswrapper[5008]: E0318 18:24:14.065953 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9646d282-d3bf-418e-ba39-fc1766e9de27" containerName="cinder-api-log" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.065960 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9646d282-d3bf-418e-ba39-fc1766e9de27" containerName="cinder-api-log" Mar 18 18:24:14 crc kubenswrapper[5008]: E0318 18:24:14.065972 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" containerName="neutron-api" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.065978 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" containerName="neutron-api" Mar 18 18:24:14 crc kubenswrapper[5008]: E0318 18:24:14.065997 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" 
containerName="neutron-httpd" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.066003 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" containerName="neutron-httpd" Mar 18 18:24:14 crc kubenswrapper[5008]: E0318 18:24:14.066015 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9646d282-d3bf-418e-ba39-fc1766e9de27" containerName="cinder-api" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.066020 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9646d282-d3bf-418e-ba39-fc1766e9de27" containerName="cinder-api" Mar 18 18:24:14 crc kubenswrapper[5008]: E0318 18:24:14.066035 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dab7b89-40a0-4059-b062-6043e4e240b9" containerName="dnsmasq-dns" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.066041 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dab7b89-40a0-4059-b062-6043e4e240b9" containerName="dnsmasq-dns" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.066202 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" containerName="neutron-api" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.066216 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" containerName="neutron-httpd" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.066228 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dab7b89-40a0-4059-b062-6043e4e240b9" containerName="dnsmasq-dns" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.066241 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9646d282-d3bf-418e-ba39-fc1766e9de27" containerName="cinder-api-log" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.066258 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9646d282-d3bf-418e-ba39-fc1766e9de27" containerName="cinder-api" Mar 
18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.076633 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.092028 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9646d282-d3bf-418e-ba39-fc1766e9de27-kube-api-access-5xwbk" (OuterVolumeSpecName: "kube-api-access-5xwbk") pod "9646d282-d3bf-418e-ba39-fc1766e9de27" (UID: "9646d282-d3bf-418e-ba39-fc1766e9de27"). InnerVolumeSpecName "kube-api-access-5xwbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.097826 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5b86568468-vhc29"] Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.121532 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9646d282-d3bf-418e-ba39-fc1766e9de27" (UID: "9646d282-d3bf-418e-ba39-fc1766e9de27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.152296 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-internal-tls-certs\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.152339 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-config-data\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.152360 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-combined-ca-bundle\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.152387 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-scripts\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.152467 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-logs\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " 
pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.152494 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-public-tls-certs\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.152514 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd649\" (UniqueName: \"kubernetes.io/projected/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-kube-api-access-sd649\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.152631 5008 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.152642 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9646d282-d3bf-418e-ba39-fc1766e9de27-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.152651 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xwbk\" (UniqueName: \"kubernetes.io/projected/9646d282-d3bf-418e-ba39-fc1766e9de27-kube-api-access-5xwbk\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.152661 5008 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9646d282-d3bf-418e-ba39-fc1766e9de27-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.152669 
5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.152677 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.186229 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-config-data" (OuterVolumeSpecName: "config-data") pod "9646d282-d3bf-418e-ba39-fc1766e9de27" (UID: "9646d282-d3bf-418e-ba39-fc1766e9de27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.206927 5008 generic.go:334] "Generic (PLEG): container finished" podID="9646d282-d3bf-418e-ba39-fc1766e9de27" containerID="872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932" exitCode=0 Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.206960 5008 generic.go:334] "Generic (PLEG): container finished" podID="9646d282-d3bf-418e-ba39-fc1766e9de27" containerID="268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67" exitCode=143 Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.207024 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9646d282-d3bf-418e-ba39-fc1766e9de27","Type":"ContainerDied","Data":"872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932"} Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.207050 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9646d282-d3bf-418e-ba39-fc1766e9de27","Type":"ContainerDied","Data":"268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67"} Mar 
18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.207060 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9646d282-d3bf-418e-ba39-fc1766e9de27","Type":"ContainerDied","Data":"47be91260d741e9ea5f0205bea8e7914a4d5882382f4b29ed92fa04d507edfba"} Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.207079 5008 scope.go:117] "RemoveContainer" containerID="872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.207195 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.231771 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a60d5e-70ec-4e1b-b025-2b6af6b1338d" path="/var/lib/kubelet/pods/e5a60d5e-70ec-4e1b-b025-2b6af6b1338d/volumes" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.254994 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-logs\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.255052 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-public-tls-certs\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.255080 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd649\" (UniqueName: \"kubernetes.io/projected/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-kube-api-access-sd649\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") 
" pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.255162 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-internal-tls-certs\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.255182 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-config-data\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.255219 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-combined-ca-bundle\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.255246 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-scripts\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.255298 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9646d282-d3bf-418e-ba39-fc1766e9de27-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.262431 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-scripts\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.264013 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-logs\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.267012 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-public-tls-certs\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.277694 5008 scope.go:117] "RemoveContainer" containerID="268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.289724 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-internal-tls-certs\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.291168 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-combined-ca-bundle\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.298179 5008 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-config-data\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.326163 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd649\" (UniqueName: \"kubernetes.io/projected/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-kube-api-access-sd649\") pod \"placement-5b86568468-vhc29\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.396037 5008 scope.go:117] "RemoveContainer" containerID="872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932" Mar 18 18:24:14 crc kubenswrapper[5008]: E0318 18:24:14.396429 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932\": container with ID starting with 872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932 not found: ID does not exist" containerID="872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.396458 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932"} err="failed to get container status \"872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932\": rpc error: code = NotFound desc = could not find container \"872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932\": container with ID starting with 872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932 not found: ID does not exist" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.396476 5008 scope.go:117] "RemoveContainer" 
containerID="268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67" Mar 18 18:24:14 crc kubenswrapper[5008]: E0318 18:24:14.396847 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67\": container with ID starting with 268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67 not found: ID does not exist" containerID="268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.396866 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67"} err="failed to get container status \"268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67\": rpc error: code = NotFound desc = could not find container \"268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67\": container with ID starting with 268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67 not found: ID does not exist" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.396879 5008 scope.go:117] "RemoveContainer" containerID="872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.397057 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932"} err="failed to get container status \"872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932\": rpc error: code = NotFound desc = could not find container \"872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932\": container with ID starting with 872747703d7fd4e028f96d70ffae43ac7ae59accb747b88549a90d409c1d8932 not found: ID does not exist" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.397073 5008 scope.go:117] 
"RemoveContainer" containerID="268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.397399 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67"} err="failed to get container status \"268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67\": rpc error: code = NotFound desc = could not find container \"268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67\": container with ID starting with 268cb720137ba03b1b0092f988861e9fe8a6bd5ae7c5ac47109235bc2ef13f67 not found: ID does not exist" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.399615 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.406325 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.426296 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.427758 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.431728 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.432026 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.432195 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.445504 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.450243 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.460282 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-config-data-custom\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.460354 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.460398 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-scripts\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.460419 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.460435 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cndf6\" (UniqueName: 
\"kubernetes.io/projected/d27fb392-40df-45a9-aeae-20781d90f02b-kube-api-access-cndf6\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.460451 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.460491 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d27fb392-40df-45a9-aeae-20781d90f02b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.460513 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-config-data\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.460542 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d27fb392-40df-45a9-aeae-20781d90f02b-logs\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.564576 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-config-data-custom\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") 
" pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.564658 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.564711 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-scripts\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.564734 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.564753 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cndf6\" (UniqueName: \"kubernetes.io/projected/d27fb392-40df-45a9-aeae-20781d90f02b-kube-api-access-cndf6\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.564771 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.564821 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/d27fb392-40df-45a9-aeae-20781d90f02b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.564854 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-config-data\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.564900 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d27fb392-40df-45a9-aeae-20781d90f02b-logs\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.565339 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d27fb392-40df-45a9-aeae-20781d90f02b-logs\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.570500 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d27fb392-40df-45a9-aeae-20781d90f02b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.582980 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-config-data-custom\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.593789 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-scripts\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.594515 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.594714 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-config-data\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.596214 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cndf6\" (UniqueName: \"kubernetes.io/projected/d27fb392-40df-45a9-aeae-20781d90f02b-kube-api-access-cndf6\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.596229 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.602686 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") " pod="openstack/cinder-api-0" 
Mar 18 18:24:14 crc kubenswrapper[5008]: I0318 18:24:14.747991 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 18:24:15 crc kubenswrapper[5008]: I0318 18:24:14.997999 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5b86568468-vhc29"]
Mar 18 18:24:15 crc kubenswrapper[5008]: I0318 18:24:15.231651 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6954c220-010e-4046-8592-192a131fe488","Type":"ContainerStarted","Data":"461c5f3e2f88a0958c6cabde8b431b1909ea245780a93267a9dddb0c4b8bd6ee"}
Mar 18 18:24:15 crc kubenswrapper[5008]: I0318 18:24:15.232857 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 18:24:15 crc kubenswrapper[5008]: I0318 18:24:15.233780 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b86568468-vhc29" event={"ID":"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b","Type":"ContainerStarted","Data":"41318c69eaf0ebfd748306eccddaeed6ec2d4a9388b3be00fa57b1b52a5ad7b8"}
Mar 18 18:24:15 crc kubenswrapper[5008]: I0318 18:24:15.233804 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b86568468-vhc29" event={"ID":"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b","Type":"ContainerStarted","Data":"c7a0237bc6f31cf971207a3f582cf6691ec65b493d61916de3897a92a4e4d11e"}
Mar 18 18:24:15 crc kubenswrapper[5008]: I0318 18:24:15.343436 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.368937754 podStartE2EDuration="12.343413169s" podCreationTimestamp="2026-03-18 18:24:03 +0000 UTC" firstStartedPulling="2026-03-18 18:24:05.562051419 +0000 UTC m=+1302.081524498" lastFinishedPulling="2026-03-18 18:24:14.536526844 +0000 UTC m=+1311.055999913" observedRunningTime="2026-03-18 18:24:15.331013123 +0000 UTC m=+1311.850486202" watchObservedRunningTime="2026-03-18 18:24:15.343413169 +0000 UTC m=+1311.862886258"
Mar 18 18:24:15 crc kubenswrapper[5008]: I0318 18:24:15.381922 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 18:24:15 crc kubenswrapper[5008]: W0318 18:24:15.475120 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd27fb392_40df_45a9_aeae_20781d90f02b.slice/crio-f9d031e798aae3fcaedcc5d62a3d819b62b8f95da8741dbe5ae4088bfcd14e75 WatchSource:0}: Error finding container f9d031e798aae3fcaedcc5d62a3d819b62b8f95da8741dbe5ae4088bfcd14e75: Status 404 returned error can't find the container with id f9d031e798aae3fcaedcc5d62a3d819b62b8f95da8741dbe5ae4088bfcd14e75
Mar 18 18:24:15 crc kubenswrapper[5008]: I0318 18:24:15.924982 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7d8b8459c4-2mq5n"
Mar 18 18:24:16 crc kubenswrapper[5008]: I0318 18:24:16.211082 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9646d282-d3bf-418e-ba39-fc1766e9de27" path="/var/lib/kubelet/pods/9646d282-d3bf-418e-ba39-fc1766e9de27/volumes"
Mar 18 18:24:16 crc kubenswrapper[5008]: I0318 18:24:16.251903 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9cb5e1e-6917-4864-9b33-ed7de36e11ec","Type":"ContainerStarted","Data":"3b7cc2f123e9419d8f9bfa02e53afcebdc40cf75753a61d13baac48536d1448e"}
Mar 18 18:24:16 crc kubenswrapper[5008]: I0318 18:24:16.256018 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d27fb392-40df-45a9-aeae-20781d90f02b","Type":"ContainerStarted","Data":"f41069b2bd17363651059e9da6c0c9d0f44b1b9587760c94ac473bda5a5dc953"}
Mar 18 18:24:16 crc kubenswrapper[5008]: I0318 18:24:16.256050 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d27fb392-40df-45a9-aeae-20781d90f02b","Type":"ContainerStarted","Data":"f9d031e798aae3fcaedcc5d62a3d819b62b8f95da8741dbe5ae4088bfcd14e75"}
Mar 18 18:24:16 crc kubenswrapper[5008]: I0318 18:24:16.259814 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b86568468-vhc29" event={"ID":"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b","Type":"ContainerStarted","Data":"eb162fa1773c66a4240c2212925b0bc630f29080f6ea79c29b7d5dbe45bff8fa"}
Mar 18 18:24:16 crc kubenswrapper[5008]: I0318 18:24:16.259854 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5b86568468-vhc29"
Mar 18 18:24:16 crc kubenswrapper[5008]: I0318 18:24:16.259881 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5b86568468-vhc29"
Mar 18 18:24:16 crc kubenswrapper[5008]: I0318 18:24:16.280567 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5b86568468-vhc29" podStartSLOduration=2.280540568 podStartE2EDuration="2.280540568s" podCreationTimestamp="2026-03-18 18:24:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:16.277984421 +0000 UTC m=+1312.797457500" watchObservedRunningTime="2026-03-18 18:24:16.280540568 +0000 UTC m=+1312.800013647"
Mar 18 18:24:17 crc kubenswrapper[5008]: I0318 18:24:17.266342 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d27fb392-40df-45a9-aeae-20781d90f02b","Type":"ContainerStarted","Data":"890384dd81bae30a3f9518e1b6415cdef1c842f2f56865d0c4e3551f7664eff2"}
Mar 18 18:24:17 crc kubenswrapper[5008]: I0318 18:24:17.266896 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 18 18:24:17 crc kubenswrapper[5008]: I0318 18:24:17.269419 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9cb5e1e-6917-4864-9b33-ed7de36e11ec","Type":"ContainerStarted","Data":"07944dbb6c5525e196bf06fbf04ef2b9327d96f150930ea3cab6c02daf6619d9"}
Mar 18 18:24:17 crc kubenswrapper[5008]: I0318 18:24:17.300341 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.30031749 podStartE2EDuration="3.30031749s" podCreationTimestamp="2026-03-18 18:24:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:17.286697572 +0000 UTC m=+1313.806170651" watchObservedRunningTime="2026-03-18 18:24:17.30031749 +0000 UTC m=+1313.819790589"
Mar 18 18:24:17 crc kubenswrapper[5008]: I0318 18:24:17.321738 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.135127838 podStartE2EDuration="8.321720023s" podCreationTimestamp="2026-03-18 18:24:09 +0000 UTC" firstStartedPulling="2026-03-18 18:24:10.290057947 +0000 UTC m=+1306.809531026" lastFinishedPulling="2026-03-18 18:24:15.476650132 +0000 UTC m=+1311.996123211" observedRunningTime="2026-03-18 18:24:17.314173565 +0000 UTC m=+1313.833646644" watchObservedRunningTime="2026-03-18 18:24:17.321720023 +0000 UTC m=+1313.841193102"
Mar 18 18:24:18 crc kubenswrapper[5008]: I0318 18:24:18.448162 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7d95565c88-9t6tp" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.159:9311/healthcheck\": read tcp 10.217.0.2:50176->10.217.0.159:9311: read: connection reset by peer"
Mar 18 18:24:18 crc kubenswrapper[5008]: I0318 18:24:18.448434 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7d95565c88-9t6tp" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.159:9311/healthcheck\": read tcp 10.217.0.2:50162->10.217.0.159:9311: read: connection reset by peer"
Mar 18 18:24:18 crc kubenswrapper[5008]: I0318 18:24:18.925912 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d95565c88-9t6tp"
Mar 18 18:24:18 crc kubenswrapper[5008]: I0318 18:24:18.994878 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-combined-ca-bundle\") pod \"56e82b67-9249-4e21-8da6-3138fddcff0e\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") "
Mar 18 18:24:18 crc kubenswrapper[5008]: I0318 18:24:18.994981 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-config-data-custom\") pod \"56e82b67-9249-4e21-8da6-3138fddcff0e\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") "
Mar 18 18:24:18 crc kubenswrapper[5008]: I0318 18:24:18.995084 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7g4g\" (UniqueName: \"kubernetes.io/projected/56e82b67-9249-4e21-8da6-3138fddcff0e-kube-api-access-l7g4g\") pod \"56e82b67-9249-4e21-8da6-3138fddcff0e\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") "
Mar 18 18:24:18 crc kubenswrapper[5008]: I0318 18:24:18.995141 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-config-data\") pod \"56e82b67-9249-4e21-8da6-3138fddcff0e\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") "
Mar 18 18:24:18 crc kubenswrapper[5008]: I0318 18:24:18.995240 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e82b67-9249-4e21-8da6-3138fddcff0e-logs\") pod \"56e82b67-9249-4e21-8da6-3138fddcff0e\" (UID: \"56e82b67-9249-4e21-8da6-3138fddcff0e\") "
Mar 18 18:24:18 crc kubenswrapper[5008]: I0318 18:24:18.996130 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56e82b67-9249-4e21-8da6-3138fddcff0e-logs" (OuterVolumeSpecName: "logs") pod "56e82b67-9249-4e21-8da6-3138fddcff0e" (UID: "56e82b67-9249-4e21-8da6-3138fddcff0e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.006263 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "56e82b67-9249-4e21-8da6-3138fddcff0e" (UID: "56e82b67-9249-4e21-8da6-3138fddcff0e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.027534 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e82b67-9249-4e21-8da6-3138fddcff0e-kube-api-access-l7g4g" (OuterVolumeSpecName: "kube-api-access-l7g4g") pod "56e82b67-9249-4e21-8da6-3138fddcff0e" (UID: "56e82b67-9249-4e21-8da6-3138fddcff0e"). InnerVolumeSpecName "kube-api-access-l7g4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.055287 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56e82b67-9249-4e21-8da6-3138fddcff0e" (UID: "56e82b67-9249-4e21-8da6-3138fddcff0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.069970 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-config-data" (OuterVolumeSpecName: "config-data") pod "56e82b67-9249-4e21-8da6-3138fddcff0e" (UID: "56e82b67-9249-4e21-8da6-3138fddcff0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.097294 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7g4g\" (UniqueName: \"kubernetes.io/projected/56e82b67-9249-4e21-8da6-3138fddcff0e-kube-api-access-l7g4g\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.097346 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.097357 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e82b67-9249-4e21-8da6-3138fddcff0e-logs\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.097370 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.097380 5008 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56e82b67-9249-4e21-8da6-3138fddcff0e-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.302739 5008 generic.go:334] "Generic (PLEG): container finished" podID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerID="4555fe554151fe69d6b14d9faffe5d02d326ed625bba17df8b18c0c2a4b791e9" exitCode=0
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.302789 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d95565c88-9t6tp" event={"ID":"56e82b67-9249-4e21-8da6-3138fddcff0e","Type":"ContainerDied","Data":"4555fe554151fe69d6b14d9faffe5d02d326ed625bba17df8b18c0c2a4b791e9"}
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.302820 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d95565c88-9t6tp" event={"ID":"56e82b67-9249-4e21-8da6-3138fddcff0e","Type":"ContainerDied","Data":"53ee03fba85a231bc7f7757dc77a8584100d0d7d249d78bce5743c4bfdb7f845"}
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.302840 5008 scope.go:117] "RemoveContainer" containerID="4555fe554151fe69d6b14d9faffe5d02d326ed625bba17df8b18c0c2a4b791e9"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.303011 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d95565c88-9t6tp"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.339027 5008 scope.go:117] "RemoveContainer" containerID="aae46bfd89d721620ddc010c5a10e41b098e49f0d05df1eb4451aab34792e299"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.349979 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7d95565c88-9t6tp"]
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.357461 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7d95565c88-9t6tp"]
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.360365 5008 scope.go:117] "RemoveContainer" containerID="4555fe554151fe69d6b14d9faffe5d02d326ed625bba17df8b18c0c2a4b791e9"
Mar 18 18:24:19 crc kubenswrapper[5008]: E0318 18:24:19.361939 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4555fe554151fe69d6b14d9faffe5d02d326ed625bba17df8b18c0c2a4b791e9\": container with ID starting with 4555fe554151fe69d6b14d9faffe5d02d326ed625bba17df8b18c0c2a4b791e9 not found: ID does not exist" containerID="4555fe554151fe69d6b14d9faffe5d02d326ed625bba17df8b18c0c2a4b791e9"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.361996 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4555fe554151fe69d6b14d9faffe5d02d326ed625bba17df8b18c0c2a4b791e9"} err="failed to get container status \"4555fe554151fe69d6b14d9faffe5d02d326ed625bba17df8b18c0c2a4b791e9\": rpc error: code = NotFound desc = could not find container \"4555fe554151fe69d6b14d9faffe5d02d326ed625bba17df8b18c0c2a4b791e9\": container with ID starting with 4555fe554151fe69d6b14d9faffe5d02d326ed625bba17df8b18c0c2a4b791e9 not found: ID does not exist"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.362041 5008 scope.go:117] "RemoveContainer" containerID="aae46bfd89d721620ddc010c5a10e41b098e49f0d05df1eb4451aab34792e299"
Mar 18 18:24:19 crc kubenswrapper[5008]: E0318 18:24:19.362541 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aae46bfd89d721620ddc010c5a10e41b098e49f0d05df1eb4451aab34792e299\": container with ID starting with aae46bfd89d721620ddc010c5a10e41b098e49f0d05df1eb4451aab34792e299 not found: ID does not exist" containerID="aae46bfd89d721620ddc010c5a10e41b098e49f0d05df1eb4451aab34792e299"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.362588 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aae46bfd89d721620ddc010c5a10e41b098e49f0d05df1eb4451aab34792e299"} err="failed to get container status \"aae46bfd89d721620ddc010c5a10e41b098e49f0d05df1eb4451aab34792e299\": rpc error: code = NotFound desc = could not find container \"aae46bfd89d721620ddc010c5a10e41b098e49f0d05df1eb4451aab34792e299\": container with ID starting with aae46bfd89d721620ddc010c5a10e41b098e49f0d05df1eb4451aab34792e299 not found: ID does not exist"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.534019 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.639728 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8995fbb57-snzkk"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.699852 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d8b7b7d5-s6tln"]
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.705946 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" podUID="46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" containerName="dnsmasq-dns" containerID="cri-o://7621bb9c665d645203bf541bba3368fa0f57450cadeaaf492451dd12c5b1a75d" gracePeriod=10
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.974897 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 18 18:24:19 crc kubenswrapper[5008]: E0318 18:24:19.975238 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerName="barbican-api-log"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.975253 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerName="barbican-api-log"
Mar 18 18:24:19 crc kubenswrapper[5008]: E0318 18:24:19.975279 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerName="barbican-api"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.975286 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerName="barbican-api"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.975447 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerName="barbican-api"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.975475 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" containerName="barbican-api-log"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.975998 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.979613 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.979828 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.979938 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-msgvv"
Mar 18 18:24:19 crc kubenswrapper[5008]: I0318 18:24:19.992168 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.120781 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/62a9d38a-89c4-4029-bdca-d828e3c1d24a-openstack-config-secret\") pod \"openstackclient\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.120853 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62a9d38a-89c4-4029-bdca-d828e3c1d24a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.120874 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/62a9d38a-89c4-4029-bdca-d828e3c1d24a-openstack-config\") pod \"openstackclient\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.120896 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48tbz\" (UniqueName: \"kubernetes.io/projected/62a9d38a-89c4-4029-bdca-d828e3c1d24a-kube-api-access-48tbz\") pod \"openstackclient\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.212943 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e82b67-9249-4e21-8da6-3138fddcff0e" path="/var/lib/kubelet/pods/56e82b67-9249-4e21-8da6-3138fddcff0e/volumes"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.222711 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/62a9d38a-89c4-4029-bdca-d828e3c1d24a-openstack-config-secret\") pod \"openstackclient\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.222783 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62a9d38a-89c4-4029-bdca-d828e3c1d24a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.222807 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/62a9d38a-89c4-4029-bdca-d828e3c1d24a-openstack-config\") pod \"openstackclient\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.222832 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48tbz\" (UniqueName: \"kubernetes.io/projected/62a9d38a-89c4-4029-bdca-d828e3c1d24a-kube-api-access-48tbz\") pod \"openstackclient\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.223750 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/62a9d38a-89c4-4029-bdca-d828e3c1d24a-openstack-config\") pod \"openstackclient\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.228755 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/62a9d38a-89c4-4029-bdca-d828e3c1d24a-openstack-config-secret\") pod \"openstackclient\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.231141 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62a9d38a-89c4-4029-bdca-d828e3c1d24a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.238075 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48tbz\" (UniqueName: \"kubernetes.io/projected/62a9d38a-89c4-4029-bdca-d828e3c1d24a-kube-api-access-48tbz\") pod \"openstackclient\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.295822 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.358024 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.361647 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.411818 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.450223 5008 generic.go:334] "Generic (PLEG): container finished" podID="46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" containerID="7621bb9c665d645203bf541bba3368fa0f57450cadeaaf492451dd12c5b1a75d" exitCode=0
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.450271 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" event={"ID":"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3","Type":"ContainerDied","Data":"7621bb9c665d645203bf541bba3368fa0f57450cadeaaf492451dd12c5b1a75d"}
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.450300 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" event={"ID":"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3","Type":"ContainerDied","Data":"26778168550e71b340cf3d82a02a64909b4b55ca7faa02c7bca2238e35a4d77e"}
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.450340 5008 scope.go:117] "RemoveContainer" containerID="7621bb9c665d645203bf541bba3368fa0f57450cadeaaf492451dd12c5b1a75d"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.502128 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 18 18:24:20 crc kubenswrapper[5008]: E0318 18:24:20.527213 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" containerName="init"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.527256 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" containerName="init"
Mar 18 18:24:20 crc kubenswrapper[5008]: E0318 18:24:20.527307 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" containerName="dnsmasq-dns"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.527317 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" containerName="dnsmasq-dns"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.528048 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" containerName="dnsmasq-dns"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.529041 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.529153 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.533051 5008 scope.go:117] "RemoveContainer" containerID="fc81abb687b305290157c81fcd0e5318de51c2a8a697f416846877ce529119b1"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.537259 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-dns-svc\") pod \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") "
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.537316 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-ovsdbserver-sb\") pod \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") "
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.537395 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-ovsdbserver-nb\") pod \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") "
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.537430 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls586\" (UniqueName: \"kubernetes.io/projected/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-kube-api-access-ls586\") pod \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") "
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.537487 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-dns-swift-storage-0\") pod \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") "
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.537572 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-config\") pod \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\" (UID: \"46a99c6a-d6c7-4f06-8e19-9370cc0fbee3\") "
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.544449 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-kube-api-access-ls586" (OuterVolumeSpecName: "kube-api-access-ls586") pod "46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" (UID: "46a99c6a-d6c7-4f06-8e19-9370cc0fbee3"). InnerVolumeSpecName "kube-api-access-ls586". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.568018 5008 scope.go:117] "RemoveContainer" containerID="7621bb9c665d645203bf541bba3368fa0f57450cadeaaf492451dd12c5b1a75d"
Mar 18 18:24:20 crc kubenswrapper[5008]: E0318 18:24:20.578834 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7621bb9c665d645203bf541bba3368fa0f57450cadeaaf492451dd12c5b1a75d\": container with ID starting with 7621bb9c665d645203bf541bba3368fa0f57450cadeaaf492451dd12c5b1a75d not found: ID does not exist" containerID="7621bb9c665d645203bf541bba3368fa0f57450cadeaaf492451dd12c5b1a75d"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.578883 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7621bb9c665d645203bf541bba3368fa0f57450cadeaaf492451dd12c5b1a75d"} err="failed to get container status \"7621bb9c665d645203bf541bba3368fa0f57450cadeaaf492451dd12c5b1a75d\": rpc error: code = NotFound desc = could not find container \"7621bb9c665d645203bf541bba3368fa0f57450cadeaaf492451dd12c5b1a75d\": container with ID starting with 7621bb9c665d645203bf541bba3368fa0f57450cadeaaf492451dd12c5b1a75d not found: ID does not exist"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.578909 5008 scope.go:117] "RemoveContainer" containerID="fc81abb687b305290157c81fcd0e5318de51c2a8a697f416846877ce529119b1"
Mar 18 18:24:20 crc kubenswrapper[5008]: E0318 18:24:20.584963 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc81abb687b305290157c81fcd0e5318de51c2a8a697f416846877ce529119b1\": container with ID starting with fc81abb687b305290157c81fcd0e5318de51c2a8a697f416846877ce529119b1 not found: ID does not exist" containerID="fc81abb687b305290157c81fcd0e5318de51c2a8a697f416846877ce529119b1"
Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.584995
5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc81abb687b305290157c81fcd0e5318de51c2a8a697f416846877ce529119b1"} err="failed to get container status \"fc81abb687b305290157c81fcd0e5318de51c2a8a697f416846877ce529119b1\": rpc error: code = NotFound desc = could not find container \"fc81abb687b305290157c81fcd0e5318de51c2a8a697f416846877ce529119b1\": container with ID starting with fc81abb687b305290157c81fcd0e5318de51c2a8a697f416846877ce529119b1 not found: ID does not exist" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.604890 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" (UID: "46a99c6a-d6c7-4f06-8e19-9370cc0fbee3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.609722 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" (UID: "46a99c6a-d6c7-4f06-8e19-9370cc0fbee3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.619590 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" (UID: "46a99c6a-d6c7-4f06-8e19-9370cc0fbee3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:20 crc kubenswrapper[5008]: E0318 18:24:20.622242 5008 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 18:24:20 crc kubenswrapper[5008]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_62a9d38a-89c4-4029-bdca-d828e3c1d24a_0(70e54deb08d9bb5b87553a2a6bdd425b9bf8c73aea4d6fed85d1d33ec12e9c6d): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"70e54deb08d9bb5b87553a2a6bdd425b9bf8c73aea4d6fed85d1d33ec12e9c6d" Netns:"/var/run/netns/44164220-ee22-4ee0-bfeb-dfd9c91e60f4" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=70e54deb08d9bb5b87553a2a6bdd425b9bf8c73aea4d6fed85d1d33ec12e9c6d;K8S_POD_UID=62a9d38a-89c4-4029-bdca-d828e3c1d24a" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/62a9d38a-89c4-4029-bdca-d828e3c1d24a]: expected pod UID "62a9d38a-89c4-4029-bdca-d828e3c1d24a" but got "1e0cfb4d-8438-45bc-882c-27c9544b40a5" from Kube API Mar 18 18:24:20 crc kubenswrapper[5008]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 18:24:20 crc kubenswrapper[5008]: > Mar 18 18:24:20 crc kubenswrapper[5008]: E0318 18:24:20.622349 5008 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 18:24:20 crc kubenswrapper[5008]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_openstackclient_openstack_62a9d38a-89c4-4029-bdca-d828e3c1d24a_0(70e54deb08d9bb5b87553a2a6bdd425b9bf8c73aea4d6fed85d1d33ec12e9c6d): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"70e54deb08d9bb5b87553a2a6bdd425b9bf8c73aea4d6fed85d1d33ec12e9c6d" Netns:"/var/run/netns/44164220-ee22-4ee0-bfeb-dfd9c91e60f4" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=70e54deb08d9bb5b87553a2a6bdd425b9bf8c73aea4d6fed85d1d33ec12e9c6d;K8S_POD_UID=62a9d38a-89c4-4029-bdca-d828e3c1d24a" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/62a9d38a-89c4-4029-bdca-d828e3c1d24a]: expected pod UID "62a9d38a-89c4-4029-bdca-d828e3c1d24a" but got "1e0cfb4d-8438-45bc-882c-27c9544b40a5" from Kube API Mar 18 18:24:20 crc kubenswrapper[5008]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 18:24:20 crc kubenswrapper[5008]: > pod="openstack/openstackclient" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.623766 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" (UID: "46a99c6a-d6c7-4f06-8e19-9370cc0fbee3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.633406 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-config" (OuterVolumeSpecName: "config") pod "46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" (UID: "46a99c6a-d6c7-4f06-8e19-9370cc0fbee3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.639678 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1e0cfb4d-8438-45bc-882c-27c9544b40a5-openstack-config-secret\") pod \"openstackclient\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " pod="openstack/openstackclient" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.639757 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgtl8\" (UniqueName: \"kubernetes.io/projected/1e0cfb4d-8438-45bc-882c-27c9544b40a5-kube-api-access-xgtl8\") pod \"openstackclient\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " pod="openstack/openstackclient" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.640010 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0cfb4d-8438-45bc-882c-27c9544b40a5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " pod="openstack/openstackclient" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.640069 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1e0cfb4d-8438-45bc-882c-27c9544b40a5-openstack-config\") pod \"openstackclient\" (UID: 
\"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " pod="openstack/openstackclient" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.640137 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.640154 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls586\" (UniqueName: \"kubernetes.io/projected/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-kube-api-access-ls586\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.640165 5008 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.640174 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.640184 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.640192 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.741365 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1e0cfb4d-8438-45bc-882c-27c9544b40a5-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " pod="openstack/openstackclient" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.741616 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgtl8\" (UniqueName: \"kubernetes.io/projected/1e0cfb4d-8438-45bc-882c-27c9544b40a5-kube-api-access-xgtl8\") pod \"openstackclient\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " pod="openstack/openstackclient" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.741725 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0cfb4d-8438-45bc-882c-27c9544b40a5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " pod="openstack/openstackclient" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.741838 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1e0cfb4d-8438-45bc-882c-27c9544b40a5-openstack-config\") pod \"openstackclient\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " pod="openstack/openstackclient" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.742568 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1e0cfb4d-8438-45bc-882c-27c9544b40a5-openstack-config\") pod \"openstackclient\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " pod="openstack/openstackclient" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.744948 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1e0cfb4d-8438-45bc-882c-27c9544b40a5-openstack-config-secret\") pod \"openstackclient\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " pod="openstack/openstackclient" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.746963 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0cfb4d-8438-45bc-882c-27c9544b40a5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " pod="openstack/openstackclient" Mar 18 18:24:20 crc kubenswrapper[5008]: I0318 18:24:20.761469 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgtl8\" (UniqueName: \"kubernetes.io/projected/1e0cfb4d-8438-45bc-882c-27c9544b40a5-kube-api-access-xgtl8\") pod \"openstackclient\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " pod="openstack/openstackclient" Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.017860 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.470918 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.471433 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.471508 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d8b7b7d5-s6tln" Mar 18 18:24:21 crc kubenswrapper[5008]: W0318 18:24:21.476885 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e0cfb4d_8438_45bc_882c_27c9544b40a5.slice/crio-1073dfe0dd4592791997a47f8192ec9374c3cf92f62f731b020feed22e0fadc8 WatchSource:0}: Error finding container 1073dfe0dd4592791997a47f8192ec9374c3cf92f62f731b020feed22e0fadc8: Status 404 returned error can't find the container with id 1073dfe0dd4592791997a47f8192ec9374c3cf92f62f731b020feed22e0fadc8 Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.479955 5008 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="62a9d38a-89c4-4029-bdca-d828e3c1d24a" podUID="1e0cfb4d-8438-45bc-882c-27c9544b40a5" Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.485602 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.507590 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d8b7b7d5-s6tln"] Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.514663 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d8b7b7d5-s6tln"] Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.559482 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62a9d38a-89c4-4029-bdca-d828e3c1d24a-combined-ca-bundle\") pod \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.560535 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48tbz\" (UniqueName: \"kubernetes.io/projected/62a9d38a-89c4-4029-bdca-d828e3c1d24a-kube-api-access-48tbz\") pod \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.560773 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/62a9d38a-89c4-4029-bdca-d828e3c1d24a-openstack-config\") pod \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.560889 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/62a9d38a-89c4-4029-bdca-d828e3c1d24a-openstack-config-secret\") pod \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\" (UID: \"62a9d38a-89c4-4029-bdca-d828e3c1d24a\") " Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.561366 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/62a9d38a-89c4-4029-bdca-d828e3c1d24a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "62a9d38a-89c4-4029-bdca-d828e3c1d24a" (UID: "62a9d38a-89c4-4029-bdca-d828e3c1d24a"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.561823 5008 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/62a9d38a-89c4-4029-bdca-d828e3c1d24a-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.564602 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a9d38a-89c4-4029-bdca-d828e3c1d24a-kube-api-access-48tbz" (OuterVolumeSpecName: "kube-api-access-48tbz") pod "62a9d38a-89c4-4029-bdca-d828e3c1d24a" (UID: "62a9d38a-89c4-4029-bdca-d828e3c1d24a"). InnerVolumeSpecName "kube-api-access-48tbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.564606 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62a9d38a-89c4-4029-bdca-d828e3c1d24a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62a9d38a-89c4-4029-bdca-d828e3c1d24a" (UID: "62a9d38a-89c4-4029-bdca-d828e3c1d24a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.567092 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62a9d38a-89c4-4029-bdca-d828e3c1d24a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "62a9d38a-89c4-4029-bdca-d828e3c1d24a" (UID: "62a9d38a-89c4-4029-bdca-d828e3c1d24a"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.663701 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62a9d38a-89c4-4029-bdca-d828e3c1d24a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.663748 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48tbz\" (UniqueName: \"kubernetes.io/projected/62a9d38a-89c4-4029-bdca-d828e3c1d24a-kube-api-access-48tbz\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:21 crc kubenswrapper[5008]: I0318 18:24:21.663760 5008 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/62a9d38a-89c4-4029-bdca-d828e3c1d24a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:22 crc kubenswrapper[5008]: I0318 18:24:22.209848 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46a99c6a-d6c7-4f06-8e19-9370cc0fbee3" path="/var/lib/kubelet/pods/46a99c6a-d6c7-4f06-8e19-9370cc0fbee3/volumes" Mar 18 18:24:22 crc kubenswrapper[5008]: I0318 18:24:22.211537 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62a9d38a-89c4-4029-bdca-d828e3c1d24a" path="/var/lib/kubelet/pods/62a9d38a-89c4-4029-bdca-d828e3c1d24a/volumes" Mar 18 18:24:22 crc kubenswrapper[5008]: I0318 18:24:22.482533 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 18:24:22 crc kubenswrapper[5008]: I0318 18:24:22.482593 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1e0cfb4d-8438-45bc-882c-27c9544b40a5","Type":"ContainerStarted","Data":"1073dfe0dd4592791997a47f8192ec9374c3cf92f62f731b020feed22e0fadc8"} Mar 18 18:24:22 crc kubenswrapper[5008]: I0318 18:24:22.488304 5008 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="62a9d38a-89c4-4029-bdca-d828e3c1d24a" podUID="1e0cfb4d-8438-45bc-882c-27c9544b40a5" Mar 18 18:24:24 crc kubenswrapper[5008]: I0318 18:24:24.463504 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:24:24 crc kubenswrapper[5008]: I0318 18:24:24.463912 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:24:24 crc kubenswrapper[5008]: I0318 18:24:24.748830 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 18:24:24 crc kubenswrapper[5008]: I0318 18:24:24.805962 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:24:24 crc kubenswrapper[5008]: I0318 18:24:24.856460 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7cc79d78dc-6z4kh"] Mar 18 18:24:24 crc kubenswrapper[5008]: I0318 18:24:24.858508 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:24 crc kubenswrapper[5008]: I0318 18:24:24.860843 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 18:24:24 crc kubenswrapper[5008]: I0318 18:24:24.861053 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 18 18:24:24 crc kubenswrapper[5008]: I0318 18:24:24.861318 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 18 18:24:24 crc kubenswrapper[5008]: I0318 18:24:24.868414 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7cc79d78dc-6z4kh"] Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.031087 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3448bfb-b04e-4f59-b275-45ae07178640-log-httpd\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.031587 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-public-tls-certs\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.031708 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-internal-tls-certs\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: 
I0318 18:24:25.031789 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e3448bfb-b04e-4f59-b275-45ae07178640-etc-swift\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.031888 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3448bfb-b04e-4f59-b275-45ae07178640-run-httpd\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.031970 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-combined-ca-bundle\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.032059 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-config-data\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.032159 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4b8l\" (UniqueName: \"kubernetes.io/projected/e3448bfb-b04e-4f59-b275-45ae07178640-kube-api-access-b4b8l\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 
18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.134333 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-config-data\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.134413 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4b8l\" (UniqueName: \"kubernetes.io/projected/e3448bfb-b04e-4f59-b275-45ae07178640-kube-api-access-b4b8l\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.134507 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3448bfb-b04e-4f59-b275-45ae07178640-log-httpd\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.134545 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-public-tls-certs\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.134595 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-internal-tls-certs\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 
18:24:25.134620 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e3448bfb-b04e-4f59-b275-45ae07178640-etc-swift\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.134665 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3448bfb-b04e-4f59-b275-45ae07178640-run-httpd\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.134696 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-combined-ca-bundle\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.135260 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3448bfb-b04e-4f59-b275-45ae07178640-run-httpd\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.135350 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3448bfb-b04e-4f59-b275-45ae07178640-log-httpd\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.142006 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-public-tls-certs\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.142243 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-config-data\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.143207 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-internal-tls-certs\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.147891 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-combined-ca-bundle\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.155325 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4b8l\" (UniqueName: \"kubernetes.io/projected/e3448bfb-b04e-4f59-b275-45ae07178640-kube-api-access-b4b8l\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.167894 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/e3448bfb-b04e-4f59-b275-45ae07178640-etc-swift\") pod \"swift-proxy-7cc79d78dc-6z4kh\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.180027 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.508639 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c9cb5e1e-6917-4864-9b33-ed7de36e11ec" containerName="cinder-scheduler" containerID="cri-o://3b7cc2f123e9419d8f9bfa02e53afcebdc40cf75753a61d13baac48536d1448e" gracePeriod=30 Mar 18 18:24:25 crc kubenswrapper[5008]: I0318 18:24:25.508717 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c9cb5e1e-6917-4864-9b33-ed7de36e11ec" containerName="probe" containerID="cri-o://07944dbb6c5525e196bf06fbf04ef2b9327d96f150930ea3cab6c02daf6619d9" gracePeriod=30 Mar 18 18:24:26 crc kubenswrapper[5008]: I0318 18:24:26.522881 5008 generic.go:334] "Generic (PLEG): container finished" podID="c9cb5e1e-6917-4864-9b33-ed7de36e11ec" containerID="07944dbb6c5525e196bf06fbf04ef2b9327d96f150930ea3cab6c02daf6619d9" exitCode=0 Mar 18 18:24:26 crc kubenswrapper[5008]: I0318 18:24:26.522950 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9cb5e1e-6917-4864-9b33-ed7de36e11ec","Type":"ContainerDied","Data":"07944dbb6c5525e196bf06fbf04ef2b9327d96f150930ea3cab6c02daf6619d9"} Mar 18 18:24:26 crc kubenswrapper[5008]: I0318 18:24:26.623233 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 18:24:26 crc kubenswrapper[5008]: I0318 18:24:26.841759 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:26 crc 
kubenswrapper[5008]: I0318 18:24:26.842208 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="ceilometer-notification-agent" containerID="cri-o://2c04800848f603732220b19ce0fce3517825fd4a6b4652f7d4e4e4b262be4ed7" gracePeriod=30 Mar 18 18:24:26 crc kubenswrapper[5008]: I0318 18:24:26.842191 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="sg-core" containerID="cri-o://37b3097e7eb31a482b5d14557b6615ab2d05394d59ca4e08adba6e10955b1241" gracePeriod=30 Mar 18 18:24:26 crc kubenswrapper[5008]: I0318 18:24:26.842366 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="proxy-httpd" containerID="cri-o://461c5f3e2f88a0958c6cabde8b431b1909ea245780a93267a9dddb0c4b8bd6ee" gracePeriod=30 Mar 18 18:24:26 crc kubenswrapper[5008]: I0318 18:24:26.846636 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="ceilometer-central-agent" containerID="cri-o://0ccf73214d6197a655f3da448eb3396b2393efd78a8580f0b1a9df56d850bb15" gracePeriod=30 Mar 18 18:24:26 crc kubenswrapper[5008]: I0318 18:24:26.949694 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.162:3000/\": read tcp 10.217.0.2:60320->10.217.0.162:3000: read: connection reset by peer" Mar 18 18:24:27 crc kubenswrapper[5008]: I0318 18:24:27.540953 5008 generic.go:334] "Generic (PLEG): container finished" podID="6954c220-010e-4046-8592-192a131fe488" containerID="461c5f3e2f88a0958c6cabde8b431b1909ea245780a93267a9dddb0c4b8bd6ee" 
exitCode=0 Mar 18 18:24:27 crc kubenswrapper[5008]: I0318 18:24:27.541431 5008 generic.go:334] "Generic (PLEG): container finished" podID="6954c220-010e-4046-8592-192a131fe488" containerID="37b3097e7eb31a482b5d14557b6615ab2d05394d59ca4e08adba6e10955b1241" exitCode=2 Mar 18 18:24:27 crc kubenswrapper[5008]: I0318 18:24:27.541443 5008 generic.go:334] "Generic (PLEG): container finished" podID="6954c220-010e-4046-8592-192a131fe488" containerID="2c04800848f603732220b19ce0fce3517825fd4a6b4652f7d4e4e4b262be4ed7" exitCode=0 Mar 18 18:24:27 crc kubenswrapper[5008]: I0318 18:24:27.541449 5008 generic.go:334] "Generic (PLEG): container finished" podID="6954c220-010e-4046-8592-192a131fe488" containerID="0ccf73214d6197a655f3da448eb3396b2393efd78a8580f0b1a9df56d850bb15" exitCode=0 Mar 18 18:24:27 crc kubenswrapper[5008]: I0318 18:24:27.541033 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6954c220-010e-4046-8592-192a131fe488","Type":"ContainerDied","Data":"461c5f3e2f88a0958c6cabde8b431b1909ea245780a93267a9dddb0c4b8bd6ee"} Mar 18 18:24:27 crc kubenswrapper[5008]: I0318 18:24:27.541510 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6954c220-010e-4046-8592-192a131fe488","Type":"ContainerDied","Data":"37b3097e7eb31a482b5d14557b6615ab2d05394d59ca4e08adba6e10955b1241"} Mar 18 18:24:27 crc kubenswrapper[5008]: I0318 18:24:27.541526 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6954c220-010e-4046-8592-192a131fe488","Type":"ContainerDied","Data":"2c04800848f603732220b19ce0fce3517825fd4a6b4652f7d4e4e4b262be4ed7"} Mar 18 18:24:27 crc kubenswrapper[5008]: I0318 18:24:27.541536 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6954c220-010e-4046-8592-192a131fe488","Type":"ContainerDied","Data":"0ccf73214d6197a655f3da448eb3396b2393efd78a8580f0b1a9df56d850bb15"} Mar 18 18:24:27 crc 
kubenswrapper[5008]: I0318 18:24:27.544414 5008 generic.go:334] "Generic (PLEG): container finished" podID="c9cb5e1e-6917-4864-9b33-ed7de36e11ec" containerID="3b7cc2f123e9419d8f9bfa02e53afcebdc40cf75753a61d13baac48536d1448e" exitCode=0 Mar 18 18:24:27 crc kubenswrapper[5008]: I0318 18:24:27.544449 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9cb5e1e-6917-4864-9b33-ed7de36e11ec","Type":"ContainerDied","Data":"3b7cc2f123e9419d8f9bfa02e53afcebdc40cf75753a61d13baac48536d1448e"} Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.104443 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.114321 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.142058 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-combined-ca-bundle\") pod \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.142133 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-config-data\") pod \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.142182 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-combined-ca-bundle\") pod \"6954c220-010e-4046-8592-192a131fe488\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " Mar 18 18:24:31 crc kubenswrapper[5008]: 
I0318 18:24:31.142304 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b9kn\" (UniqueName: \"kubernetes.io/projected/6954c220-010e-4046-8592-192a131fe488-kube-api-access-4b9kn\") pod \"6954c220-010e-4046-8592-192a131fe488\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.142343 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-config-data\") pod \"6954c220-010e-4046-8592-192a131fe488\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.142401 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-scripts\") pod \"6954c220-010e-4046-8592-192a131fe488\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.142436 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-etc-machine-id\") pod \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.142475 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6954c220-010e-4046-8592-192a131fe488-log-httpd\") pod \"6954c220-010e-4046-8592-192a131fe488\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.142494 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-scripts\") pod \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\" (UID: 
\"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.142541 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6954c220-010e-4046-8592-192a131fe488-run-httpd\") pod \"6954c220-010e-4046-8592-192a131fe488\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.142603 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-config-data-custom\") pod \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.142627 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s75qn\" (UniqueName: \"kubernetes.io/projected/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-kube-api-access-s75qn\") pod \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\" (UID: \"c9cb5e1e-6917-4864-9b33-ed7de36e11ec\") " Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.142655 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-sg-core-conf-yaml\") pod \"6954c220-010e-4046-8592-192a131fe488\" (UID: \"6954c220-010e-4046-8592-192a131fe488\") " Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.144133 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6954c220-010e-4046-8592-192a131fe488-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6954c220-010e-4046-8592-192a131fe488" (UID: "6954c220-010e-4046-8592-192a131fe488"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.144487 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6954c220-010e-4046-8592-192a131fe488-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6954c220-010e-4046-8592-192a131fe488" (UID: "6954c220-010e-4046-8592-192a131fe488"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.145178 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c9cb5e1e-6917-4864-9b33-ed7de36e11ec" (UID: "c9cb5e1e-6917-4864-9b33-ed7de36e11ec"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.148300 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-scripts" (OuterVolumeSpecName: "scripts") pod "c9cb5e1e-6917-4864-9b33-ed7de36e11ec" (UID: "c9cb5e1e-6917-4864-9b33-ed7de36e11ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.156634 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-kube-api-access-s75qn" (OuterVolumeSpecName: "kube-api-access-s75qn") pod "c9cb5e1e-6917-4864-9b33-ed7de36e11ec" (UID: "c9cb5e1e-6917-4864-9b33-ed7de36e11ec"). InnerVolumeSpecName "kube-api-access-s75qn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.158773 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6954c220-010e-4046-8592-192a131fe488-kube-api-access-4b9kn" (OuterVolumeSpecName: "kube-api-access-4b9kn") pod "6954c220-010e-4046-8592-192a131fe488" (UID: "6954c220-010e-4046-8592-192a131fe488"). InnerVolumeSpecName "kube-api-access-4b9kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.164539 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c9cb5e1e-6917-4864-9b33-ed7de36e11ec" (UID: "c9cb5e1e-6917-4864-9b33-ed7de36e11ec"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.166802 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-scripts" (OuterVolumeSpecName: "scripts") pod "6954c220-010e-4046-8592-192a131fe488" (UID: "6954c220-010e-4046-8592-192a131fe488"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.188421 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6954c220-010e-4046-8592-192a131fe488" (UID: "6954c220-010e-4046-8592-192a131fe488"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.244370 5008 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.244396 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s75qn\" (UniqueName: \"kubernetes.io/projected/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-kube-api-access-s75qn\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.244408 5008 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.244418 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b9kn\" (UniqueName: \"kubernetes.io/projected/6954c220-010e-4046-8592-192a131fe488-kube-api-access-4b9kn\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.244503 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.244515 5008 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.244523 5008 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6954c220-010e-4046-8592-192a131fe488-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 
18:24:31.244531 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.244540 5008 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6954c220-010e-4046-8592-192a131fe488-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.244761 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9cb5e1e-6917-4864-9b33-ed7de36e11ec" (UID: "c9cb5e1e-6917-4864-9b33-ed7de36e11ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.254686 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-config-data" (OuterVolumeSpecName: "config-data") pod "6954c220-010e-4046-8592-192a131fe488" (UID: "6954c220-010e-4046-8592-192a131fe488"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.271293 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6954c220-010e-4046-8592-192a131fe488" (UID: "6954c220-010e-4046-8592-192a131fe488"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.273693 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-config-data" (OuterVolumeSpecName: "config-data") pod "c9cb5e1e-6917-4864-9b33-ed7de36e11ec" (UID: "c9cb5e1e-6917-4864-9b33-ed7de36e11ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.326829 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7cc79d78dc-6z4kh"] Mar 18 18:24:31 crc kubenswrapper[5008]: W0318 18:24:31.332175 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3448bfb_b04e_4f59_b275_45ae07178640.slice/crio-1b4706de9697d4f4dc77da13cd77bdefb9eaad94681e6e2726fbf0b890e52c0a WatchSource:0}: Error finding container 1b4706de9697d4f4dc77da13cd77bdefb9eaad94681e6e2726fbf0b890e52c0a: Status 404 returned error can't find the container with id 1b4706de9697d4f4dc77da13cd77bdefb9eaad94681e6e2726fbf0b890e52c0a Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.345607 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.346050 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cb5e1e-6917-4864-9b33-ed7de36e11ec-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.346107 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 
18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.346183 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6954c220-010e-4046-8592-192a131fe488-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.586845 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.588162 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9cb5e1e-6917-4864-9b33-ed7de36e11ec","Type":"ContainerDied","Data":"fc97f8a2bfe1c9976c4bc8afae44bc7eeb6c295dd2ff5841e43db3d0f13e0746"} Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.588887 5008 scope.go:117] "RemoveContainer" containerID="07944dbb6c5525e196bf06fbf04ef2b9327d96f150930ea3cab6c02daf6619d9" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.593533 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1e0cfb4d-8438-45bc-882c-27c9544b40a5","Type":"ContainerStarted","Data":"3a98cf2e27abc88c220c6aeb0751a297dd22a0499c8e9a35f107d7f5b969d959"} Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.610549 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" event={"ID":"e3448bfb-b04e-4f59-b275-45ae07178640","Type":"ContainerStarted","Data":"868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307"} Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.610605 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" event={"ID":"e3448bfb-b04e-4f59-b275-45ae07178640","Type":"ContainerStarted","Data":"1b4706de9697d4f4dc77da13cd77bdefb9eaad94681e6e2726fbf0b890e52c0a"} Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.614500 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.614467 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6954c220-010e-4046-8592-192a131fe488","Type":"ContainerDied","Data":"77f7af1430a5f5b658ae5594806c415cf068d71b17f4bb76df24cc3cc79eb776"} Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.681231 5008 scope.go:117] "RemoveContainer" containerID="3b7cc2f123e9419d8f9bfa02e53afcebdc40cf75753a61d13baac48536d1448e" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.702671 5008 scope.go:117] "RemoveContainer" containerID="461c5f3e2f88a0958c6cabde8b431b1909ea245780a93267a9dddb0c4b8bd6ee" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.716628 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.447682035 podStartE2EDuration="11.716612562s" podCreationTimestamp="2026-03-18 18:24:20 +0000 UTC" firstStartedPulling="2026-03-18 18:24:21.479732665 +0000 UTC m=+1317.999205744" lastFinishedPulling="2026-03-18 18:24:30.748663192 +0000 UTC m=+1327.268136271" observedRunningTime="2026-03-18 18:24:31.610773189 +0000 UTC m=+1328.130246268" watchObservedRunningTime="2026-03-18 18:24:31.716612562 +0000 UTC m=+1328.236085641" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.717769 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.725469 5008 scope.go:117] "RemoveContainer" containerID="37b3097e7eb31a482b5d14557b6615ab2d05394d59ca4e08adba6e10955b1241" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.726614 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.740726 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 
18:24:31.747755 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.758865 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:31 crc kubenswrapper[5008]: E0318 18:24:31.759233 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="sg-core" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.759249 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="sg-core" Mar 18 18:24:31 crc kubenswrapper[5008]: E0318 18:24:31.759262 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cb5e1e-6917-4864-9b33-ed7de36e11ec" containerName="probe" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.759268 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cb5e1e-6917-4864-9b33-ed7de36e11ec" containerName="probe" Mar 18 18:24:31 crc kubenswrapper[5008]: E0318 18:24:31.759290 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="ceilometer-notification-agent" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.759297 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="ceilometer-notification-agent" Mar 18 18:24:31 crc kubenswrapper[5008]: E0318 18:24:31.759306 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cb5e1e-6917-4864-9b33-ed7de36e11ec" containerName="cinder-scheduler" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.759312 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cb5e1e-6917-4864-9b33-ed7de36e11ec" containerName="cinder-scheduler" Mar 18 18:24:31 crc kubenswrapper[5008]: E0318 18:24:31.759321 5008 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6954c220-010e-4046-8592-192a131fe488" containerName="proxy-httpd" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.759326 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="proxy-httpd" Mar 18 18:24:31 crc kubenswrapper[5008]: E0318 18:24:31.759346 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="ceilometer-central-agent" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.759352 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="ceilometer-central-agent" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.759516 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9cb5e1e-6917-4864-9b33-ed7de36e11ec" containerName="probe" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.759531 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="sg-core" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.759541 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="ceilometer-notification-agent" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.759564 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="proxy-httpd" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.759576 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6954c220-010e-4046-8592-192a131fe488" containerName="ceilometer-central-agent" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.759584 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9cb5e1e-6917-4864-9b33-ed7de36e11ec" containerName="cinder-scheduler" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.761229 5008 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.763976 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.767580 5008 scope.go:117] "RemoveContainer" containerID="2c04800848f603732220b19ce0fce3517825fd4a6b4652f7d4e4e4b262be4ed7" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.765373 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.768349 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.783381 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.785117 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.794758 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.814405 5008 scope.go:117] "RemoveContainer" containerID="0ccf73214d6197a655f3da448eb3396b2393efd78a8580f0b1a9df56d850bb15" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.817514 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.862827 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b698032-cab1-4d1d-85f2-846e38fbceba-log-httpd\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.862909 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.862947 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-scripts\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.862997 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.863029 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b698032-cab1-4d1d-85f2-846e38fbceba-run-httpd\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.863060 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-config-data\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.863101 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4mnkq\" (UniqueName: \"kubernetes.io/projected/4b698032-cab1-4d1d-85f2-846e38fbceba-kube-api-access-4mnkq\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.863123 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.863146 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-config-data\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.863188 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-scripts\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.863211 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.863239 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.863276 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhzsq\" (UniqueName: \"kubernetes.io/projected/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-kube-api-access-lhzsq\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.965385 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.965454 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-scripts\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.965503 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.965539 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b698032-cab1-4d1d-85f2-846e38fbceba-run-httpd\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " 
pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.965605 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-config-data\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.965644 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mnkq\" (UniqueName: \"kubernetes.io/projected/4b698032-cab1-4d1d-85f2-846e38fbceba-kube-api-access-4mnkq\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.965669 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.965693 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-config-data\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.965736 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-scripts\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.965759 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.965791 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.965828 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhzsq\" (UniqueName: \"kubernetes.io/projected/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-kube-api-access-lhzsq\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.965856 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b698032-cab1-4d1d-85f2-846e38fbceba-log-httpd\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.966475 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b698032-cab1-4d1d-85f2-846e38fbceba-log-httpd\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.966731 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 
18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.970506 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b698032-cab1-4d1d-85f2-846e38fbceba-run-httpd\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.977035 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.978243 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.978645 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.980777 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-scripts\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.982247 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.985230 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-config-data\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.989350 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mnkq\" (UniqueName: \"kubernetes.io/projected/4b698032-cab1-4d1d-85f2-846e38fbceba-kube-api-access-4mnkq\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.990265 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-config-data\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:31 crc kubenswrapper[5008]: I0318 18:24:31.999543 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-scripts\") pod \"ceilometer-0\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " pod="openstack/ceilometer-0" Mar 18 18:24:32 crc kubenswrapper[5008]: I0318 18:24:32.000585 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhzsq\" (UniqueName: \"kubernetes.io/projected/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-kube-api-access-lhzsq\") pod \"cinder-scheduler-0\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") " pod="openstack/cinder-scheduler-0" Mar 18 18:24:32 crc kubenswrapper[5008]: I0318 
18:24:32.087360 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:32 crc kubenswrapper[5008]: I0318 18:24:32.143020 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:24:32 crc kubenswrapper[5008]: I0318 18:24:32.208996 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6954c220-010e-4046-8592-192a131fe488" path="/var/lib/kubelet/pods/6954c220-010e-4046-8592-192a131fe488/volumes" Mar 18 18:24:32 crc kubenswrapper[5008]: I0318 18:24:32.209719 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9cb5e1e-6917-4864-9b33-ed7de36e11ec" path="/var/lib/kubelet/pods/c9cb5e1e-6917-4864-9b33-ed7de36e11ec/volumes" Mar 18 18:24:32 crc kubenswrapper[5008]: I0318 18:24:32.583662 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:32 crc kubenswrapper[5008]: I0318 18:24:32.624306 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" event={"ID":"e3448bfb-b04e-4f59-b275-45ae07178640","Type":"ContainerStarted","Data":"e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781"} Mar 18 18:24:32 crc kubenswrapper[5008]: I0318 18:24:32.624489 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:32 crc kubenswrapper[5008]: I0318 18:24:32.629872 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b698032-cab1-4d1d-85f2-846e38fbceba","Type":"ContainerStarted","Data":"85f38a652c9601356e549499bf715ef2303b86a0b3a1a3e59a64b62581673cd7"} Mar 18 18:24:32 crc kubenswrapper[5008]: I0318 18:24:32.644584 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" podStartSLOduration=8.644542558 podStartE2EDuration="8.644542558s" 
podCreationTimestamp="2026-03-18 18:24:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:32.643040149 +0000 UTC m=+1329.162513228" watchObservedRunningTime="2026-03-18 18:24:32.644542558 +0000 UTC m=+1329.164015657" Mar 18 18:24:32 crc kubenswrapper[5008]: I0318 18:24:32.677445 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:24:32 crc kubenswrapper[5008]: I0318 18:24:32.893577 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:33 crc kubenswrapper[5008]: I0318 18:24:33.648222 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9","Type":"ContainerStarted","Data":"20f1fb66e65134755453d78cfd50a9bd3959e1e313d6a2e4fcb63644bb833236"} Mar 18 18:24:33 crc kubenswrapper[5008]: I0318 18:24:33.648766 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9","Type":"ContainerStarted","Data":"8cdd24a0ab60621ebf8f6f65dc159f5b4b227aee6053095397ed6a70abcea49f"} Mar 18 18:24:33 crc kubenswrapper[5008]: I0318 18:24:33.648822 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:35 crc kubenswrapper[5008]: I0318 18:24:35.666649 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b698032-cab1-4d1d-85f2-846e38fbceba","Type":"ContainerStarted","Data":"c4af4045ed63abc1a346a196c9fb093d8b23361eb32483e5f7178ef120a2bb3d"} Mar 18 18:24:35 crc kubenswrapper[5008]: I0318 18:24:35.669875 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9","Type":"ContainerStarted","Data":"2f33719f92055074851b51599de618f808c5a4383b6fb86e503d5b01cf723941"} Mar 18 18:24:35 crc kubenswrapper[5008]: I0318 18:24:35.692170 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.692150916 podStartE2EDuration="4.692150916s" podCreationTimestamp="2026-03-18 18:24:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:35.686092897 +0000 UTC m=+1332.205565976" watchObservedRunningTime="2026-03-18 18:24:35.692150916 +0000 UTC m=+1332.211623995" Mar 18 18:24:36 crc kubenswrapper[5008]: I0318 18:24:36.977530 5008 scope.go:117] "RemoveContainer" containerID="0f52ebe4b101a8a1011a40ad89e5922e116ee42a53b3bcfa84275658fd556111" Mar 18 18:24:37 crc kubenswrapper[5008]: I0318 18:24:37.143914 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 18:24:37 crc kubenswrapper[5008]: I0318 18:24:37.689456 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b698032-cab1-4d1d-85f2-846e38fbceba","Type":"ContainerStarted","Data":"007f5e6cebbc5eba783e9ffaeba7d283c4ef9634acf91828ccd204d11e045c05"} Mar 18 18:24:37 crc kubenswrapper[5008]: I0318 18:24:37.689524 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b698032-cab1-4d1d-85f2-846e38fbceba","Type":"ContainerStarted","Data":"7306d0bd129527e482366483290f409c7529aec9b495443f650404394a7a55fd"} Mar 18 18:24:39 crc kubenswrapper[5008]: I0318 18:24:39.482866 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-85f9f77dc-mg4p7" Mar 18 18:24:39 crc kubenswrapper[5008]: I0318 18:24:39.546630 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bfd8598c6-wqfkp"] 
Mar 18 18:24:39 crc kubenswrapper[5008]: I0318 18:24:39.546917 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bfd8598c6-wqfkp" podUID="89496eaf-50d1-45a6-802f-f127c3766a3b" containerName="neutron-api" containerID="cri-o://610e5ea90ac2ed7181ffc28f663e96834799eed0a2d6a49084894caa9dd208f6" gracePeriod=30 Mar 18 18:24:39 crc kubenswrapper[5008]: I0318 18:24:39.547101 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bfd8598c6-wqfkp" podUID="89496eaf-50d1-45a6-802f-f127c3766a3b" containerName="neutron-httpd" containerID="cri-o://db9b0e64ba2d9d784f3f83ded3ad1d7fc71e671670340db39bbbfe4e5d0921c0" gracePeriod=30 Mar 18 18:24:39 crc kubenswrapper[5008]: I0318 18:24:39.716906 5008 generic.go:334] "Generic (PLEG): container finished" podID="89496eaf-50d1-45a6-802f-f127c3766a3b" containerID="db9b0e64ba2d9d784f3f83ded3ad1d7fc71e671670340db39bbbfe4e5d0921c0" exitCode=0 Mar 18 18:24:39 crc kubenswrapper[5008]: I0318 18:24:39.717129 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bfd8598c6-wqfkp" event={"ID":"89496eaf-50d1-45a6-802f-f127c3766a3b","Type":"ContainerDied","Data":"db9b0e64ba2d9d784f3f83ded3ad1d7fc71e671670340db39bbbfe4e5d0921c0"} Mar 18 18:24:40 crc kubenswrapper[5008]: I0318 18:24:40.189350 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:40 crc kubenswrapper[5008]: I0318 18:24:40.194638 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:24:41 crc kubenswrapper[5008]: I0318 18:24:41.247152 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:24:41 crc kubenswrapper[5008]: I0318 18:24:41.247957 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="31a8e3a3-716a-49ec-962e-25580e47f4a6" containerName="glance-log" containerID="cri-o://af73e056114a2a33a709e5c17ac2da30e70d25f0b9ec21b66200b676e70c1886" gracePeriod=30 Mar 18 18:24:41 crc kubenswrapper[5008]: I0318 18:24:41.248041 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="31a8e3a3-716a-49ec-962e-25580e47f4a6" containerName="glance-httpd" containerID="cri-o://02df5e9b1d3df7f9943325c9d6500ce870862dd110d6174a71739dbe225d9cb4" gracePeriod=30 Mar 18 18:24:41 crc kubenswrapper[5008]: I0318 18:24:41.747454 5008 generic.go:334] "Generic (PLEG): container finished" podID="31a8e3a3-716a-49ec-962e-25580e47f4a6" containerID="af73e056114a2a33a709e5c17ac2da30e70d25f0b9ec21b66200b676e70c1886" exitCode=143 Mar 18 18:24:41 crc kubenswrapper[5008]: I0318 18:24:41.747813 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31a8e3a3-716a-49ec-962e-25580e47f4a6","Type":"ContainerDied","Data":"af73e056114a2a33a709e5c17ac2da30e70d25f0b9ec21b66200b676e70c1886"} Mar 18 18:24:41 crc kubenswrapper[5008]: I0318 18:24:41.755275 5008 generic.go:334] "Generic (PLEG): container finished" podID="89496eaf-50d1-45a6-802f-f127c3766a3b" containerID="610e5ea90ac2ed7181ffc28f663e96834799eed0a2d6a49084894caa9dd208f6" exitCode=0 Mar 18 18:24:41 crc kubenswrapper[5008]: I0318 18:24:41.755347 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bfd8598c6-wqfkp" event={"ID":"89496eaf-50d1-45a6-802f-f127c3766a3b","Type":"ContainerDied","Data":"610e5ea90ac2ed7181ffc28f663e96834799eed0a2d6a49084894caa9dd208f6"} Mar 18 18:24:41 crc kubenswrapper[5008]: I0318 18:24:41.758829 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b698032-cab1-4d1d-85f2-846e38fbceba","Type":"ContainerStarted","Data":"d5cc88bf4b40c6e94e86ec0957d564d53645147700f4512ad714eb22f8a05a37"} Mar 18 18:24:41 crc 
kubenswrapper[5008]: I0318 18:24:41.759474 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 18:24:41 crc kubenswrapper[5008]: I0318 18:24:41.759025 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="proxy-httpd" containerID="cri-o://d5cc88bf4b40c6e94e86ec0957d564d53645147700f4512ad714eb22f8a05a37" gracePeriod=30 Mar 18 18:24:41 crc kubenswrapper[5008]: I0318 18:24:41.758998 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="ceilometer-central-agent" containerID="cri-o://c4af4045ed63abc1a346a196c9fb093d8b23361eb32483e5f7178ef120a2bb3d" gracePeriod=30 Mar 18 18:24:41 crc kubenswrapper[5008]: I0318 18:24:41.759054 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="ceilometer-notification-agent" containerID="cri-o://7306d0bd129527e482366483290f409c7529aec9b495443f650404394a7a55fd" gracePeriod=30 Mar 18 18:24:41 crc kubenswrapper[5008]: I0318 18:24:41.759039 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="sg-core" containerID="cri-o://007f5e6cebbc5eba783e9ffaeba7d283c4ef9634acf91828ccd204d11e045c05" gracePeriod=30 Mar 18 18:24:41 crc kubenswrapper[5008]: I0318 18:24:41.799491 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.573326218 podStartE2EDuration="10.799466079s" podCreationTimestamp="2026-03-18 18:24:31 +0000 UTC" firstStartedPulling="2026-03-18 18:24:32.588374111 +0000 UTC m=+1329.107847190" lastFinishedPulling="2026-03-18 18:24:40.814513972 +0000 UTC m=+1337.333987051" 
observedRunningTime="2026-03-18 18:24:41.795501304 +0000 UTC m=+1338.314974383" watchObservedRunningTime="2026-03-18 18:24:41.799466079 +0000 UTC m=+1338.318939158" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.004596 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.052880 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-httpd-config\") pod \"89496eaf-50d1-45a6-802f-f127c3766a3b\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.053003 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-config\") pod \"89496eaf-50d1-45a6-802f-f127c3766a3b\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.053118 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-ovndb-tls-certs\") pod \"89496eaf-50d1-45a6-802f-f127c3766a3b\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.053239 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cjsf\" (UniqueName: \"kubernetes.io/projected/89496eaf-50d1-45a6-802f-f127c3766a3b-kube-api-access-5cjsf\") pod \"89496eaf-50d1-45a6-802f-f127c3766a3b\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.053343 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-combined-ca-bundle\") pod \"89496eaf-50d1-45a6-802f-f127c3766a3b\" (UID: \"89496eaf-50d1-45a6-802f-f127c3766a3b\") " Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.060866 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89496eaf-50d1-45a6-802f-f127c3766a3b-kube-api-access-5cjsf" (OuterVolumeSpecName: "kube-api-access-5cjsf") pod "89496eaf-50d1-45a6-802f-f127c3766a3b" (UID: "89496eaf-50d1-45a6-802f-f127c3766a3b"). InnerVolumeSpecName "kube-api-access-5cjsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.064711 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "89496eaf-50d1-45a6-802f-f127c3766a3b" (UID: "89496eaf-50d1-45a6-802f-f127c3766a3b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.108813 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89496eaf-50d1-45a6-802f-f127c3766a3b" (UID: "89496eaf-50d1-45a6-802f-f127c3766a3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.119955 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-config" (OuterVolumeSpecName: "config") pod "89496eaf-50d1-45a6-802f-f127c3766a3b" (UID: "89496eaf-50d1-45a6-802f-f127c3766a3b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.151526 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "89496eaf-50d1-45a6-802f-f127c3766a3b" (UID: "89496eaf-50d1-45a6-802f-f127c3766a3b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.155794 5008 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.155818 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cjsf\" (UniqueName: \"kubernetes.io/projected/89496eaf-50d1-45a6-802f-f127c3766a3b-kube-api-access-5cjsf\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.155830 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.155840 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.155849 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/89496eaf-50d1-45a6-802f-f127c3766a3b-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.359882 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 
18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.767783 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bfd8598c6-wqfkp" event={"ID":"89496eaf-50d1-45a6-802f-f127c3766a3b","Type":"ContainerDied","Data":"8161022d056d5e6c08cc080eaa78d9c23395cbc52b9355004d16352597140a47"} Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.767828 5008 scope.go:117] "RemoveContainer" containerID="db9b0e64ba2d9d784f3f83ded3ad1d7fc71e671670340db39bbbfe4e5d0921c0" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.767938 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bfd8598c6-wqfkp" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.772422 5008 generic.go:334] "Generic (PLEG): container finished" podID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerID="d5cc88bf4b40c6e94e86ec0957d564d53645147700f4512ad714eb22f8a05a37" exitCode=0 Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.772448 5008 generic.go:334] "Generic (PLEG): container finished" podID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerID="007f5e6cebbc5eba783e9ffaeba7d283c4ef9634acf91828ccd204d11e045c05" exitCode=2 Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.772456 5008 generic.go:334] "Generic (PLEG): container finished" podID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerID="7306d0bd129527e482366483290f409c7529aec9b495443f650404394a7a55fd" exitCode=0 Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.772483 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b698032-cab1-4d1d-85f2-846e38fbceba","Type":"ContainerDied","Data":"d5cc88bf4b40c6e94e86ec0957d564d53645147700f4512ad714eb22f8a05a37"} Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.772577 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4b698032-cab1-4d1d-85f2-846e38fbceba","Type":"ContainerDied","Data":"007f5e6cebbc5eba783e9ffaeba7d283c4ef9634acf91828ccd204d11e045c05"} Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.772598 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b698032-cab1-4d1d-85f2-846e38fbceba","Type":"ContainerDied","Data":"7306d0bd129527e482366483290f409c7529aec9b495443f650404394a7a55fd"} Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.790103 5008 scope.go:117] "RemoveContainer" containerID="610e5ea90ac2ed7181ffc28f663e96834799eed0a2d6a49084894caa9dd208f6" Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.794249 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bfd8598c6-wqfkp"] Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.807050 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5bfd8598c6-wqfkp"] Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.971866 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.972138 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9d2a0f99-a606-4784-b6ca-9f6561d3cf93" containerName="glance-log" containerID="cri-o://5b503eba8269a9c14bab3bc0a7cd48f4da284b901f3c8e8402accd085dd47492" gracePeriod=30 Mar 18 18:24:42 crc kubenswrapper[5008]: I0318 18:24:42.972222 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9d2a0f99-a606-4784-b6ca-9f6561d3cf93" containerName="glance-httpd" containerID="cri-o://4da6a325e3a862c85900f07ee6336bf83662872bab02f31103bf97e761e286ec" gracePeriod=30 Mar 18 18:24:43 crc kubenswrapper[5008]: I0318 18:24:43.788353 5008 generic.go:334] "Generic (PLEG): container finished" podID="9d2a0f99-a606-4784-b6ca-9f6561d3cf93" 
containerID="5b503eba8269a9c14bab3bc0a7cd48f4da284b901f3c8e8402accd085dd47492" exitCode=143 Mar 18 18:24:43 crc kubenswrapper[5008]: I0318 18:24:43.788750 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d2a0f99-a606-4784-b6ca-9f6561d3cf93","Type":"ContainerDied","Data":"5b503eba8269a9c14bab3bc0a7cd48f4da284b901f3c8e8402accd085dd47492"} Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.213701 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89496eaf-50d1-45a6-802f-f127c3766a3b" path="/var/lib/kubelet/pods/89496eaf-50d1-45a6-802f-f127c3766a3b/volumes" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.272126 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.393942 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-sg-core-conf-yaml\") pod \"4b698032-cab1-4d1d-85f2-846e38fbceba\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.394042 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-scripts\") pod \"4b698032-cab1-4d1d-85f2-846e38fbceba\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.394122 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-config-data\") pod \"4b698032-cab1-4d1d-85f2-846e38fbceba\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.394158 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b698032-cab1-4d1d-85f2-846e38fbceba-log-httpd\") pod \"4b698032-cab1-4d1d-85f2-846e38fbceba\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.394223 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b698032-cab1-4d1d-85f2-846e38fbceba-run-httpd\") pod \"4b698032-cab1-4d1d-85f2-846e38fbceba\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.394277 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-combined-ca-bundle\") pod \"4b698032-cab1-4d1d-85f2-846e38fbceba\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.394382 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mnkq\" (UniqueName: \"kubernetes.io/projected/4b698032-cab1-4d1d-85f2-846e38fbceba-kube-api-access-4mnkq\") pod \"4b698032-cab1-4d1d-85f2-846e38fbceba\" (UID: \"4b698032-cab1-4d1d-85f2-846e38fbceba\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.394984 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b698032-cab1-4d1d-85f2-846e38fbceba-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b698032-cab1-4d1d-85f2-846e38fbceba" (UID: "4b698032-cab1-4d1d-85f2-846e38fbceba"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.395239 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b698032-cab1-4d1d-85f2-846e38fbceba-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b698032-cab1-4d1d-85f2-846e38fbceba" (UID: "4b698032-cab1-4d1d-85f2-846e38fbceba"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.400664 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b698032-cab1-4d1d-85f2-846e38fbceba-kube-api-access-4mnkq" (OuterVolumeSpecName: "kube-api-access-4mnkq") pod "4b698032-cab1-4d1d-85f2-846e38fbceba" (UID: "4b698032-cab1-4d1d-85f2-846e38fbceba"). InnerVolumeSpecName "kube-api-access-4mnkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.400863 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-scripts" (OuterVolumeSpecName: "scripts") pod "4b698032-cab1-4d1d-85f2-846e38fbceba" (UID: "4b698032-cab1-4d1d-85f2-846e38fbceba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.427373 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4b698032-cab1-4d1d-85f2-846e38fbceba" (UID: "4b698032-cab1-4d1d-85f2-846e38fbceba"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.496824 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mnkq\" (UniqueName: \"kubernetes.io/projected/4b698032-cab1-4d1d-85f2-846e38fbceba-kube-api-access-4mnkq\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.496859 5008 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.496870 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.496879 5008 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b698032-cab1-4d1d-85f2-846e38fbceba-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.496892 5008 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b698032-cab1-4d1d-85f2-846e38fbceba-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.502927 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-config-data" (OuterVolumeSpecName: "config-data") pod "4b698032-cab1-4d1d-85f2-846e38fbceba" (UID: "4b698032-cab1-4d1d-85f2-846e38fbceba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.505133 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b698032-cab1-4d1d-85f2-846e38fbceba" (UID: "4b698032-cab1-4d1d-85f2-846e38fbceba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.597829 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.597990 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b698032-cab1-4d1d-85f2-846e38fbceba-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.814681 5008 generic.go:334] "Generic (PLEG): container finished" podID="31a8e3a3-716a-49ec-962e-25580e47f4a6" containerID="02df5e9b1d3df7f9943325c9d6500ce870862dd110d6174a71739dbe225d9cb4" exitCode=0 Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.814802 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31a8e3a3-716a-49ec-962e-25580e47f4a6","Type":"ContainerDied","Data":"02df5e9b1d3df7f9943325c9d6500ce870862dd110d6174a71739dbe225d9cb4"} Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.818156 5008 generic.go:334] "Generic (PLEG): container finished" podID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerID="c4af4045ed63abc1a346a196c9fb093d8b23361eb32483e5f7178ef120a2bb3d" exitCode=0 Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.818213 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"4b698032-cab1-4d1d-85f2-846e38fbceba","Type":"ContainerDied","Data":"c4af4045ed63abc1a346a196c9fb093d8b23361eb32483e5f7178ef120a2bb3d"} Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.818243 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b698032-cab1-4d1d-85f2-846e38fbceba","Type":"ContainerDied","Data":"85f38a652c9601356e549499bf715ef2303b86a0b3a1a3e59a64b62581673cd7"} Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.818264 5008 scope.go:117] "RemoveContainer" containerID="d5cc88bf4b40c6e94e86ec0957d564d53645147700f4512ad714eb22f8a05a37" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.818326 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.857928 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.858320 5008 scope.go:117] "RemoveContainer" containerID="007f5e6cebbc5eba783e9ffaeba7d283c4ef9634acf91828ccd204d11e045c05" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.870362 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.877852 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.891883 5008 scope.go:117] "RemoveContainer" containerID="7306d0bd129527e482366483290f409c7529aec9b495443f650404394a7a55fd" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.924639 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:44 crc kubenswrapper[5008]: E0318 18:24:44.925178 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89496eaf-50d1-45a6-802f-f127c3766a3b" 
containerName="neutron-httpd" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925190 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="89496eaf-50d1-45a6-802f-f127c3766a3b" containerName="neutron-httpd" Mar 18 18:24:44 crc kubenswrapper[5008]: E0318 18:24:44.925225 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89496eaf-50d1-45a6-802f-f127c3766a3b" containerName="neutron-api" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925233 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="89496eaf-50d1-45a6-802f-f127c3766a3b" containerName="neutron-api" Mar 18 18:24:44 crc kubenswrapper[5008]: E0318 18:24:44.925243 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="ceilometer-central-agent" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925252 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="ceilometer-central-agent" Mar 18 18:24:44 crc kubenswrapper[5008]: E0318 18:24:44.925260 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a8e3a3-716a-49ec-962e-25580e47f4a6" containerName="glance-log" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925266 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a8e3a3-716a-49ec-962e-25580e47f4a6" containerName="glance-log" Mar 18 18:24:44 crc kubenswrapper[5008]: E0318 18:24:44.925274 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="proxy-httpd" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925281 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="proxy-httpd" Mar 18 18:24:44 crc kubenswrapper[5008]: E0318 18:24:44.925293 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" 
containerName="ceilometer-notification-agent" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925299 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="ceilometer-notification-agent" Mar 18 18:24:44 crc kubenswrapper[5008]: E0318 18:24:44.925338 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="sg-core" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925344 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="sg-core" Mar 18 18:24:44 crc kubenswrapper[5008]: E0318 18:24:44.925355 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a8e3a3-716a-49ec-962e-25580e47f4a6" containerName="glance-httpd" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925360 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a8e3a3-716a-49ec-962e-25580e47f4a6" containerName="glance-httpd" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925529 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="89496eaf-50d1-45a6-802f-f127c3766a3b" containerName="neutron-api" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925546 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="ceilometer-central-agent" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925570 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="proxy-httpd" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925581 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="89496eaf-50d1-45a6-802f-f127c3766a3b" containerName="neutron-httpd" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925592 5008 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="ceilometer-notification-agent" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925604 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a8e3a3-716a-49ec-962e-25580e47f4a6" containerName="glance-httpd" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925613 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a8e3a3-716a-49ec-962e-25580e47f4a6" containerName="glance-log" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.925622 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" containerName="sg-core" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.927107 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.928700 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-combined-ca-bundle\") pod \"31a8e3a3-716a-49ec-962e-25580e47f4a6\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.928747 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"31a8e3a3-716a-49ec-962e-25580e47f4a6\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.928803 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-config-data\") pod \"31a8e3a3-716a-49ec-962e-25580e47f4a6\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.928884 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31a8e3a3-716a-49ec-962e-25580e47f4a6-httpd-run\") pod \"31a8e3a3-716a-49ec-962e-25580e47f4a6\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.928906 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31a8e3a3-716a-49ec-962e-25580e47f4a6-logs\") pod \"31a8e3a3-716a-49ec-962e-25580e47f4a6\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.928982 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-scripts\") pod \"31a8e3a3-716a-49ec-962e-25580e47f4a6\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.929004 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pdhs\" (UniqueName: \"kubernetes.io/projected/31a8e3a3-716a-49ec-962e-25580e47f4a6-kube-api-access-7pdhs\") pod \"31a8e3a3-716a-49ec-962e-25580e47f4a6\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.929038 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-public-tls-certs\") pod \"31a8e3a3-716a-49ec-962e-25580e47f4a6\" (UID: \"31a8e3a3-716a-49ec-962e-25580e47f4a6\") " Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.930266 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31a8e3a3-716a-49ec-962e-25580e47f4a6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "31a8e3a3-716a-49ec-962e-25580e47f4a6" (UID: "31a8e3a3-716a-49ec-962e-25580e47f4a6"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.935004 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.935190 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.935961 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31a8e3a3-716a-49ec-962e-25580e47f4a6-logs" (OuterVolumeSpecName: "logs") pod "31a8e3a3-716a-49ec-962e-25580e47f4a6" (UID: "31a8e3a3-716a-49ec-962e-25580e47f4a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.947099 5008 scope.go:117] "RemoveContainer" containerID="c4af4045ed63abc1a346a196c9fb093d8b23361eb32483e5f7178ef120a2bb3d" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.948580 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "31a8e3a3-716a-49ec-962e-25580e47f4a6" (UID: "31a8e3a3-716a-49ec-962e-25580e47f4a6"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.949813 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-scripts" (OuterVolumeSpecName: "scripts") pod "31a8e3a3-716a-49ec-962e-25580e47f4a6" (UID: "31a8e3a3-716a-49ec-962e-25580e47f4a6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.952407 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a8e3a3-716a-49ec-962e-25580e47f4a6-kube-api-access-7pdhs" (OuterVolumeSpecName: "kube-api-access-7pdhs") pod "31a8e3a3-716a-49ec-962e-25580e47f4a6" (UID: "31a8e3a3-716a-49ec-962e-25580e47f4a6"). InnerVolumeSpecName "kube-api-access-7pdhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.953515 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.982802 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31a8e3a3-716a-49ec-962e-25580e47f4a6" (UID: "31a8e3a3-716a-49ec-962e-25580e47f4a6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.990489 5008 scope.go:117] "RemoveContainer" containerID="d5cc88bf4b40c6e94e86ec0957d564d53645147700f4512ad714eb22f8a05a37" Mar 18 18:24:44 crc kubenswrapper[5008]: E0318 18:24:44.991364 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5cc88bf4b40c6e94e86ec0957d564d53645147700f4512ad714eb22f8a05a37\": container with ID starting with d5cc88bf4b40c6e94e86ec0957d564d53645147700f4512ad714eb22f8a05a37 not found: ID does not exist" containerID="d5cc88bf4b40c6e94e86ec0957d564d53645147700f4512ad714eb22f8a05a37" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.991392 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5cc88bf4b40c6e94e86ec0957d564d53645147700f4512ad714eb22f8a05a37"} err="failed to get container status \"d5cc88bf4b40c6e94e86ec0957d564d53645147700f4512ad714eb22f8a05a37\": rpc error: code = NotFound desc = could not find container \"d5cc88bf4b40c6e94e86ec0957d564d53645147700f4512ad714eb22f8a05a37\": container with ID starting with d5cc88bf4b40c6e94e86ec0957d564d53645147700f4512ad714eb22f8a05a37 not found: ID does not exist" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.991415 5008 scope.go:117] "RemoveContainer" containerID="007f5e6cebbc5eba783e9ffaeba7d283c4ef9634acf91828ccd204d11e045c05" Mar 18 18:24:44 crc kubenswrapper[5008]: E0318 18:24:44.991819 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"007f5e6cebbc5eba783e9ffaeba7d283c4ef9634acf91828ccd204d11e045c05\": container with ID starting with 007f5e6cebbc5eba783e9ffaeba7d283c4ef9634acf91828ccd204d11e045c05 not found: ID does not exist" containerID="007f5e6cebbc5eba783e9ffaeba7d283c4ef9634acf91828ccd204d11e045c05" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.991835 
5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"007f5e6cebbc5eba783e9ffaeba7d283c4ef9634acf91828ccd204d11e045c05"} err="failed to get container status \"007f5e6cebbc5eba783e9ffaeba7d283c4ef9634acf91828ccd204d11e045c05\": rpc error: code = NotFound desc = could not find container \"007f5e6cebbc5eba783e9ffaeba7d283c4ef9634acf91828ccd204d11e045c05\": container with ID starting with 007f5e6cebbc5eba783e9ffaeba7d283c4ef9634acf91828ccd204d11e045c05 not found: ID does not exist" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.991849 5008 scope.go:117] "RemoveContainer" containerID="7306d0bd129527e482366483290f409c7529aec9b495443f650404394a7a55fd" Mar 18 18:24:44 crc kubenswrapper[5008]: E0318 18:24:44.992056 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7306d0bd129527e482366483290f409c7529aec9b495443f650404394a7a55fd\": container with ID starting with 7306d0bd129527e482366483290f409c7529aec9b495443f650404394a7a55fd not found: ID does not exist" containerID="7306d0bd129527e482366483290f409c7529aec9b495443f650404394a7a55fd" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.992072 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7306d0bd129527e482366483290f409c7529aec9b495443f650404394a7a55fd"} err="failed to get container status \"7306d0bd129527e482366483290f409c7529aec9b495443f650404394a7a55fd\": rpc error: code = NotFound desc = could not find container \"7306d0bd129527e482366483290f409c7529aec9b495443f650404394a7a55fd\": container with ID starting with 7306d0bd129527e482366483290f409c7529aec9b495443f650404394a7a55fd not found: ID does not exist" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.992084 5008 scope.go:117] "RemoveContainer" containerID="c4af4045ed63abc1a346a196c9fb093d8b23361eb32483e5f7178ef120a2bb3d" Mar 18 18:24:44 crc kubenswrapper[5008]: E0318 
18:24:44.993931 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4af4045ed63abc1a346a196c9fb093d8b23361eb32483e5f7178ef120a2bb3d\": container with ID starting with c4af4045ed63abc1a346a196c9fb093d8b23361eb32483e5f7178ef120a2bb3d not found: ID does not exist" containerID="c4af4045ed63abc1a346a196c9fb093d8b23361eb32483e5f7178ef120a2bb3d" Mar 18 18:24:44 crc kubenswrapper[5008]: I0318 18:24:44.993972 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4af4045ed63abc1a346a196c9fb093d8b23361eb32483e5f7178ef120a2bb3d"} err="failed to get container status \"c4af4045ed63abc1a346a196c9fb093d8b23361eb32483e5f7178ef120a2bb3d\": rpc error: code = NotFound desc = could not find container \"c4af4045ed63abc1a346a196c9fb093d8b23361eb32483e5f7178ef120a2bb3d\": container with ID starting with c4af4045ed63abc1a346a196c9fb093d8b23361eb32483e5f7178ef120a2bb3d not found: ID does not exist" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.026958 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-config-data" (OuterVolumeSpecName: "config-data") pod "31a8e3a3-716a-49ec-962e-25580e47f4a6" (UID: "31a8e3a3-716a-49ec-962e-25580e47f4a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.030849 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-scripts\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.030912 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.030996 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5caa8f0-549e-4ace-8215-a99657497d0a-run-httpd\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.031037 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5caa8f0-549e-4ace-8215-a99657497d0a-log-httpd\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.031056 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4z4m\" (UniqueName: \"kubernetes.io/projected/b5caa8f0-549e-4ace-8215-a99657497d0a-kube-api-access-m4z4m\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.031303 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-config-data\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.031365 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.031509 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.031527 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31a8e3a3-716a-49ec-962e-25580e47f4a6-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.031541 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31a8e3a3-716a-49ec-962e-25580e47f4a6-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.031565 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.031728 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pdhs\" (UniqueName: \"kubernetes.io/projected/31a8e3a3-716a-49ec-962e-25580e47f4a6-kube-api-access-7pdhs\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:45 crc 
kubenswrapper[5008]: I0318 18:24:45.031745 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.031776 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.040053 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "31a8e3a3-716a-49ec-962e-25580e47f4a6" (UID: "31a8e3a3-716a-49ec-962e-25580e47f4a6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.053341 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.133084 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.133159 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-scripts\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.133190 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.133247 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5caa8f0-549e-4ace-8215-a99657497d0a-run-httpd\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.133272 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5caa8f0-549e-4ace-8215-a99657497d0a-log-httpd\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.133295 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4z4m\" (UniqueName: \"kubernetes.io/projected/b5caa8f0-549e-4ace-8215-a99657497d0a-kube-api-access-m4z4m\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.133330 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-config-data\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.133372 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31a8e3a3-716a-49ec-962e-25580e47f4a6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 
18:24:45.133383 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.134210 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5caa8f0-549e-4ace-8215-a99657497d0a-log-httpd\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.134246 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5caa8f0-549e-4ace-8215-a99657497d0a-run-httpd\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.137917 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.138008 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.138478 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-scripts\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.140990 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-config-data\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.158436 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4z4m\" (UniqueName: \"kubernetes.io/projected/b5caa8f0-549e-4ace-8215-a99657497d0a-kube-api-access-m4z4m\") pod \"ceilometer-0\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.253037 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.602071 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.606443 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5b86568468-vhc29" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.682666 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b8cb95cb8-42zf7"] Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.682943 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b8cb95cb8-42zf7" podUID="e558c4fe-4c62-49b6-bcbe-838e404d216c" containerName="placement-log" containerID="cri-o://937fbe19a64b70075afb8572736ff34948d329dab0dd7f4095eb32fd67a37baf" gracePeriod=30 Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.683016 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b8cb95cb8-42zf7" podUID="e558c4fe-4c62-49b6-bcbe-838e404d216c" containerName="placement-api" 
containerID="cri-o://049f70af73478e42ee612a1c3c40887112dadd34cbb4d041635a2619f37a1ce7" gracePeriod=30 Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.787452 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.841572 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5caa8f0-549e-4ace-8215-a99657497d0a","Type":"ContainerStarted","Data":"aed5b736131af015f1941fdd6f83f6d56cb359e211190636cca3125e232c83c9"} Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.850349 5008 generic.go:334] "Generic (PLEG): container finished" podID="e558c4fe-4c62-49b6-bcbe-838e404d216c" containerID="937fbe19a64b70075afb8572736ff34948d329dab0dd7f4095eb32fd67a37baf" exitCode=143 Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.850789 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b8cb95cb8-42zf7" event={"ID":"e558c4fe-4c62-49b6-bcbe-838e404d216c","Type":"ContainerDied","Data":"937fbe19a64b70075afb8572736ff34948d329dab0dd7f4095eb32fd67a37baf"} Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.867013 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31a8e3a3-716a-49ec-962e-25580e47f4a6","Type":"ContainerDied","Data":"133bf0b5d01f0d7849f694f5fc6b409eeaa4a83d95037a88f2a221a5db609012"} Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.867066 5008 scope.go:117] "RemoveContainer" containerID="02df5e9b1d3df7f9943325c9d6500ce870862dd110d6174a71739dbe225d9cb4" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.867156 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.926084 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.943402 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.948982 5008 scope.go:117] "RemoveContainer" containerID="af73e056114a2a33a709e5c17ac2da30e70d25f0b9ec21b66200b676e70c1886" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.961387 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.963177 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.972014 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 18:24:45 crc kubenswrapper[5008]: I0318 18:24:45.982264 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.006313 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.054895 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-config-data\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.054966 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.055079 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdvcs\" (UniqueName: \"kubernetes.io/projected/8679cebf-8eea-45ae-be70-26eea9396f8e-kube-api-access-sdvcs\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.055124 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.055149 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8679cebf-8eea-45ae-be70-26eea9396f8e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.055288 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-scripts\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.055346 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8679cebf-8eea-45ae-be70-26eea9396f8e-logs\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.055493 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.157177 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8679cebf-8eea-45ae-be70-26eea9396f8e-logs\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.157257 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.157292 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-config-data\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.157310 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.157354 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdvcs\" (UniqueName: \"kubernetes.io/projected/8679cebf-8eea-45ae-be70-26eea9396f8e-kube-api-access-sdvcs\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.157385 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.157405 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8679cebf-8eea-45ae-be70-26eea9396f8e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.157430 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-scripts\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.160060 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.160947 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8679cebf-8eea-45ae-be70-26eea9396f8e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.161171 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8679cebf-8eea-45ae-be70-26eea9396f8e-logs\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.165684 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.173962 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-config-data\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.173969 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.177197 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.178301 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdvcs\" (UniqueName: \"kubernetes.io/projected/8679cebf-8eea-45ae-be70-26eea9396f8e-kube-api-access-sdvcs\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.200871 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.213173 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a8e3a3-716a-49ec-962e-25580e47f4a6" path="/var/lib/kubelet/pods/31a8e3a3-716a-49ec-962e-25580e47f4a6/volumes" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.213851 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b698032-cab1-4d1d-85f2-846e38fbceba" path="/var/lib/kubelet/pods/4b698032-cab1-4d1d-85f2-846e38fbceba/volumes" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.289176 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.662540 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.769381 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.769434 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-httpd-run\") pod \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.769541 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-internal-tls-certs\") pod \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.769622 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-scripts\") pod \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.769789 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-combined-ca-bundle\") pod \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " Mar 18 18:24:46 crc 
kubenswrapper[5008]: I0318 18:24:46.769818 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwbh8\" (UniqueName: \"kubernetes.io/projected/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-kube-api-access-bwbh8\") pod \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.769845 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-logs\") pod \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.769918 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-config-data\") pod \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\" (UID: \"9d2a0f99-a606-4784-b6ca-9f6561d3cf93\") " Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.770308 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9d2a0f99-a606-4784-b6ca-9f6561d3cf93" (UID: "9d2a0f99-a606-4784-b6ca-9f6561d3cf93"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.770612 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-logs" (OuterVolumeSpecName: "logs") pod "9d2a0f99-a606-4784-b6ca-9f6561d3cf93" (UID: "9d2a0f99-a606-4784-b6ca-9f6561d3cf93"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.773853 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-scripts" (OuterVolumeSpecName: "scripts") pod "9d2a0f99-a606-4784-b6ca-9f6561d3cf93" (UID: "9d2a0f99-a606-4784-b6ca-9f6561d3cf93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.776367 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-kube-api-access-bwbh8" (OuterVolumeSpecName: "kube-api-access-bwbh8") pod "9d2a0f99-a606-4784-b6ca-9f6561d3cf93" (UID: "9d2a0f99-a606-4784-b6ca-9f6561d3cf93"). InnerVolumeSpecName "kube-api-access-bwbh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.776779 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "9d2a0f99-a606-4784-b6ca-9f6561d3cf93" (UID: "9d2a0f99-a606-4784-b6ca-9f6561d3cf93"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.797220 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d2a0f99-a606-4784-b6ca-9f6561d3cf93" (UID: "9d2a0f99-a606-4784-b6ca-9f6561d3cf93"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.822892 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9d2a0f99-a606-4784-b6ca-9f6561d3cf93" (UID: "9d2a0f99-a606-4784-b6ca-9f6561d3cf93"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.832372 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-config-data" (OuterVolumeSpecName: "config-data") pod "9d2a0f99-a606-4784-b6ca-9f6561d3cf93" (UID: "9d2a0f99-a606-4784-b6ca-9f6561d3cf93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.871495 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.871530 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.871541 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.871620 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:46 crc 
kubenswrapper[5008]: I0318 18:24:46.871632 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.871641 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwbh8\" (UniqueName: \"kubernetes.io/projected/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-kube-api-access-bwbh8\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.871649 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.871659 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d2a0f99-a606-4784-b6ca-9f6561d3cf93-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.900466 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.901096 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.919104 5008 generic.go:334] "Generic (PLEG): container finished" podID="9d2a0f99-a606-4784-b6ca-9f6561d3cf93" containerID="4da6a325e3a862c85900f07ee6336bf83662872bab02f31103bf97e761e286ec" exitCode=0 Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.919159 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9d2a0f99-a606-4784-b6ca-9f6561d3cf93","Type":"ContainerDied","Data":"4da6a325e3a862c85900f07ee6336bf83662872bab02f31103bf97e761e286ec"} Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.919186 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d2a0f99-a606-4784-b6ca-9f6561d3cf93","Type":"ContainerDied","Data":"9d5b7dc39e865c570c9cf076f2bab1395af4e14f4149d549f0743576100f6a26"} Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.919202 5008 scope.go:117] "RemoveContainer" containerID="4da6a325e3a862c85900f07ee6336bf83662872bab02f31103bf97e761e286ec" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.919306 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: W0318 18:24:46.919409 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8679cebf_8eea_45ae_be70_26eea9396f8e.slice/crio-d2758c296b22f071f0cb5ab2f5c2f3b24fad467d0ff8428fcc456dbb490f93c2 WatchSource:0}: Error finding container d2758c296b22f071f0cb5ab2f5c2f3b24fad467d0ff8428fcc456dbb490f93c2: Status 404 returned error can't find the container with id d2758c296b22f071f0cb5ab2f5c2f3b24fad467d0ff8428fcc456dbb490f93c2 Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.925860 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5caa8f0-549e-4ace-8215-a99657497d0a","Type":"ContainerStarted","Data":"5e2e803deb3c5464f3549b2db0ada3f8fb8db251aaaddc2f2f513b497e11c6e3"} Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.954637 5008 scope.go:117] "RemoveContainer" containerID="5b503eba8269a9c14bab3bc0a7cd48f4da284b901f3c8e8402accd085dd47492" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.958976 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.974832 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.976088 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.986934 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:24:46 crc kubenswrapper[5008]: E0318 18:24:46.987284 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2a0f99-a606-4784-b6ca-9f6561d3cf93" containerName="glance-log" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.987294 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2a0f99-a606-4784-b6ca-9f6561d3cf93" containerName="glance-log" Mar 18 18:24:46 crc kubenswrapper[5008]: E0318 18:24:46.987312 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2a0f99-a606-4784-b6ca-9f6561d3cf93" containerName="glance-httpd" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.987318 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2a0f99-a606-4784-b6ca-9f6561d3cf93" containerName="glance-httpd" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.987476 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d2a0f99-a606-4784-b6ca-9f6561d3cf93" containerName="glance-log" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.987493 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d2a0f99-a606-4784-b6ca-9f6561d3cf93" containerName="glance-httpd" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.988400 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.991388 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.991524 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.994331 5008 scope.go:117] "RemoveContainer" containerID="4da6a325e3a862c85900f07ee6336bf83662872bab02f31103bf97e761e286ec" Mar 18 18:24:46 crc kubenswrapper[5008]: E0318 18:24:46.994692 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da6a325e3a862c85900f07ee6336bf83662872bab02f31103bf97e761e286ec\": container with ID starting with 4da6a325e3a862c85900f07ee6336bf83662872bab02f31103bf97e761e286ec not found: ID does not exist" containerID="4da6a325e3a862c85900f07ee6336bf83662872bab02f31103bf97e761e286ec" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.994715 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da6a325e3a862c85900f07ee6336bf83662872bab02f31103bf97e761e286ec"} err="failed to get container status \"4da6a325e3a862c85900f07ee6336bf83662872bab02f31103bf97e761e286ec\": rpc error: code = NotFound desc = could not find container \"4da6a325e3a862c85900f07ee6336bf83662872bab02f31103bf97e761e286ec\": container with ID starting with 4da6a325e3a862c85900f07ee6336bf83662872bab02f31103bf97e761e286ec not found: ID does not exist" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.994730 5008 scope.go:117] "RemoveContainer" containerID="5b503eba8269a9c14bab3bc0a7cd48f4da284b901f3c8e8402accd085dd47492" Mar 18 18:24:46 crc kubenswrapper[5008]: E0318 18:24:46.996168 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"5b503eba8269a9c14bab3bc0a7cd48f4da284b901f3c8e8402accd085dd47492\": container with ID starting with 5b503eba8269a9c14bab3bc0a7cd48f4da284b901f3c8e8402accd085dd47492 not found: ID does not exist" containerID="5b503eba8269a9c14bab3bc0a7cd48f4da284b901f3c8e8402accd085dd47492" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.996193 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b503eba8269a9c14bab3bc0a7cd48f4da284b901f3c8e8402accd085dd47492"} err="failed to get container status \"5b503eba8269a9c14bab3bc0a7cd48f4da284b901f3c8e8402accd085dd47492\": rpc error: code = NotFound desc = could not find container \"5b503eba8269a9c14bab3bc0a7cd48f4da284b901f3c8e8402accd085dd47492\": container with ID starting with 5b503eba8269a9c14bab3bc0a7cd48f4da284b901f3c8e8402accd085dd47492 not found: ID does not exist" Mar 18 18:24:46 crc kubenswrapper[5008]: I0318 18:24:46.996783 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.078280 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.078950 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.079136 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.079220 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.079293 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/582dafe2-2020-4966-921d-cc5e9f0db46c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.079377 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg2nl\" (UniqueName: \"kubernetes.io/projected/582dafe2-2020-4966-921d-cc5e9f0db46c-kube-api-access-qg2nl\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.079705 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582dafe2-2020-4966-921d-cc5e9f0db46c-logs\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.079791 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.181758 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.181809 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.181839 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.181863 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/582dafe2-2020-4966-921d-cc5e9f0db46c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.181898 5008 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-qg2nl\" (UniqueName: \"kubernetes.io/projected/582dafe2-2020-4966-921d-cc5e9f0db46c-kube-api-access-qg2nl\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.181936 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582dafe2-2020-4966-921d-cc5e9f0db46c-logs\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.181972 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.182048 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.183169 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/582dafe2-2020-4966-921d-cc5e9f0db46c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.185016 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582dafe2-2020-4966-921d-cc5e9f0db46c-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.185131 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.193404 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.193530 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.197057 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.204806 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.216073 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg2nl\" (UniqueName: \"kubernetes.io/projected/582dafe2-2020-4966-921d-cc5e9f0db46c-kube-api-access-qg2nl\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.225608 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.287715 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.858611 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.945835 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"582dafe2-2020-4966-921d-cc5e9f0db46c","Type":"ContainerStarted","Data":"b02240955d9f245f85f6f408ff050e2a3308eb99e270630612c0e482cdcaa037"} Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.947731 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5caa8f0-549e-4ace-8215-a99657497d0a","Type":"ContainerStarted","Data":"cf4a40ff1479075a6371209287d3d3ae6dcae4e36ec269c423c033d2f3268d8a"} Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.973258 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"8679cebf-8eea-45ae-be70-26eea9396f8e","Type":"ContainerStarted","Data":"48672c4f4a417aeb7d70b46843dbaf3f5264f47434917232456154f0d644258b"} Mar 18 18:24:47 crc kubenswrapper[5008]: I0318 18:24:47.973321 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8679cebf-8eea-45ae-be70-26eea9396f8e","Type":"ContainerStarted","Data":"d2758c296b22f071f0cb5ab2f5c2f3b24fad467d0ff8428fcc456dbb490f93c2"} Mar 18 18:24:48 crc kubenswrapper[5008]: I0318 18:24:48.210781 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d2a0f99-a606-4784-b6ca-9f6561d3cf93" path="/var/lib/kubelet/pods/9d2a0f99-a606-4784-b6ca-9f6561d3cf93/volumes" Mar 18 18:24:48 crc kubenswrapper[5008]: I0318 18:24:48.331194 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.003707 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"582dafe2-2020-4966-921d-cc5e9f0db46c","Type":"ContainerStarted","Data":"4efce2fc93ac7a338b3af0f031e3448a62c3f819288460beeb882dd6abd7cbe9"} Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.009635 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5caa8f0-549e-4ace-8215-a99657497d0a","Type":"ContainerStarted","Data":"e9517de439a689cab6b942f49bc18c83e5ea018fcb3d860d6a8c7d4cb56849b1"} Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.020332 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8679cebf-8eea-45ae-be70-26eea9396f8e","Type":"ContainerStarted","Data":"1bbd2b9a3501779f9dfc17cd725e0dca6af96fcee035c33d265e2278978c5d37"} Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.055005 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" 
podStartSLOduration=4.05498202 podStartE2EDuration="4.05498202s" podCreationTimestamp="2026-03-18 18:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:49.040776596 +0000 UTC m=+1345.560249695" watchObservedRunningTime="2026-03-18 18:24:49.05498202 +0000 UTC m=+1345.574455099" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.427950 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.537522 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-internal-tls-certs\") pod \"e558c4fe-4c62-49b6-bcbe-838e404d216c\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.537929 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t59j9\" (UniqueName: \"kubernetes.io/projected/e558c4fe-4c62-49b6-bcbe-838e404d216c-kube-api-access-t59j9\") pod \"e558c4fe-4c62-49b6-bcbe-838e404d216c\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.538182 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e558c4fe-4c62-49b6-bcbe-838e404d216c-logs\") pod \"e558c4fe-4c62-49b6-bcbe-838e404d216c\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.538286 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-config-data\") pod \"e558c4fe-4c62-49b6-bcbe-838e404d216c\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " Mar 18 18:24:49 crc 
kubenswrapper[5008]: I0318 18:24:49.538389 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-public-tls-certs\") pod \"e558c4fe-4c62-49b6-bcbe-838e404d216c\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.538600 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-combined-ca-bundle\") pod \"e558c4fe-4c62-49b6-bcbe-838e404d216c\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.538723 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-scripts\") pod \"e558c4fe-4c62-49b6-bcbe-838e404d216c\" (UID: \"e558c4fe-4c62-49b6-bcbe-838e404d216c\") " Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.541008 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e558c4fe-4c62-49b6-bcbe-838e404d216c-logs" (OuterVolumeSpecName: "logs") pod "e558c4fe-4c62-49b6-bcbe-838e404d216c" (UID: "e558c4fe-4c62-49b6-bcbe-838e404d216c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.567813 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e558c4fe-4c62-49b6-bcbe-838e404d216c-kube-api-access-t59j9" (OuterVolumeSpecName: "kube-api-access-t59j9") pod "e558c4fe-4c62-49b6-bcbe-838e404d216c" (UID: "e558c4fe-4c62-49b6-bcbe-838e404d216c"). InnerVolumeSpecName "kube-api-access-t59j9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.573697 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-scripts" (OuterVolumeSpecName: "scripts") pod "e558c4fe-4c62-49b6-bcbe-838e404d216c" (UID: "e558c4fe-4c62-49b6-bcbe-838e404d216c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.624279 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e558c4fe-4c62-49b6-bcbe-838e404d216c" (UID: "e558c4fe-4c62-49b6-bcbe-838e404d216c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.633360 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-config-data" (OuterVolumeSpecName: "config-data") pod "e558c4fe-4c62-49b6-bcbe-838e404d216c" (UID: "e558c4fe-4c62-49b6-bcbe-838e404d216c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.641490 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.641523 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.641532 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t59j9\" (UniqueName: \"kubernetes.io/projected/e558c4fe-4c62-49b6-bcbe-838e404d216c-kube-api-access-t59j9\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.641543 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e558c4fe-4c62-49b6-bcbe-838e404d216c-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.641567 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.693508 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e558c4fe-4c62-49b6-bcbe-838e404d216c" (UID: "e558c4fe-4c62-49b6-bcbe-838e404d216c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.703305 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e558c4fe-4c62-49b6-bcbe-838e404d216c" (UID: "e558c4fe-4c62-49b6-bcbe-838e404d216c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.743585 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:49 crc kubenswrapper[5008]: I0318 18:24:49.743626 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e558c4fe-4c62-49b6-bcbe-838e404d216c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.032043 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"582dafe2-2020-4966-921d-cc5e9f0db46c","Type":"ContainerStarted","Data":"7c0883124c7538aea54980f006ad5cde12fc4e637570de29a2ca5d0d49c482e5"} Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.034427 5008 generic.go:334] "Generic (PLEG): container finished" podID="e558c4fe-4c62-49b6-bcbe-838e404d216c" containerID="049f70af73478e42ee612a1c3c40887112dadd34cbb4d041635a2619f37a1ce7" exitCode=0 Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.034477 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b8cb95cb8-42zf7" event={"ID":"e558c4fe-4c62-49b6-bcbe-838e404d216c","Type":"ContainerDied","Data":"049f70af73478e42ee612a1c3c40887112dadd34cbb4d041635a2619f37a1ce7"} Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.034521 5008 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b8cb95cb8-42zf7" Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.034542 5008 scope.go:117] "RemoveContainer" containerID="049f70af73478e42ee612a1c3c40887112dadd34cbb4d041635a2619f37a1ce7" Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.034526 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b8cb95cb8-42zf7" event={"ID":"e558c4fe-4c62-49b6-bcbe-838e404d216c","Type":"ContainerDied","Data":"df6626fa38a969d5df6aa43626b0f88a0c97d093ff1ca0cec9f59025aaa6bcbe"} Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.071719 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.071697061 podStartE2EDuration="4.071697061s" podCreationTimestamp="2026-03-18 18:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:50.0690045 +0000 UTC m=+1346.588477579" watchObservedRunningTime="2026-03-18 18:24:50.071697061 +0000 UTC m=+1346.591170140" Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.093695 5008 scope.go:117] "RemoveContainer" containerID="937fbe19a64b70075afb8572736ff34948d329dab0dd7f4095eb32fd67a37baf" Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.132607 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b8cb95cb8-42zf7"] Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.160441 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6b8cb95cb8-42zf7"] Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.174689 5008 scope.go:117] "RemoveContainer" containerID="049f70af73478e42ee612a1c3c40887112dadd34cbb4d041635a2619f37a1ce7" Mar 18 18:24:50 crc kubenswrapper[5008]: E0318 18:24:50.179971 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"049f70af73478e42ee612a1c3c40887112dadd34cbb4d041635a2619f37a1ce7\": container with ID starting with 049f70af73478e42ee612a1c3c40887112dadd34cbb4d041635a2619f37a1ce7 not found: ID does not exist" containerID="049f70af73478e42ee612a1c3c40887112dadd34cbb4d041635a2619f37a1ce7" Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.180010 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049f70af73478e42ee612a1c3c40887112dadd34cbb4d041635a2619f37a1ce7"} err="failed to get container status \"049f70af73478e42ee612a1c3c40887112dadd34cbb4d041635a2619f37a1ce7\": rpc error: code = NotFound desc = could not find container \"049f70af73478e42ee612a1c3c40887112dadd34cbb4d041635a2619f37a1ce7\": container with ID starting with 049f70af73478e42ee612a1c3c40887112dadd34cbb4d041635a2619f37a1ce7 not found: ID does not exist" Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.180035 5008 scope.go:117] "RemoveContainer" containerID="937fbe19a64b70075afb8572736ff34948d329dab0dd7f4095eb32fd67a37baf" Mar 18 18:24:50 crc kubenswrapper[5008]: E0318 18:24:50.195825 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"937fbe19a64b70075afb8572736ff34948d329dab0dd7f4095eb32fd67a37baf\": container with ID starting with 937fbe19a64b70075afb8572736ff34948d329dab0dd7f4095eb32fd67a37baf not found: ID does not exist" containerID="937fbe19a64b70075afb8572736ff34948d329dab0dd7f4095eb32fd67a37baf" Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.195879 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937fbe19a64b70075afb8572736ff34948d329dab0dd7f4095eb32fd67a37baf"} err="failed to get container status \"937fbe19a64b70075afb8572736ff34948d329dab0dd7f4095eb32fd67a37baf\": rpc error: code = NotFound desc = could not find container 
\"937fbe19a64b70075afb8572736ff34948d329dab0dd7f4095eb32fd67a37baf\": container with ID starting with 937fbe19a64b70075afb8572736ff34948d329dab0dd7f4095eb32fd67a37baf not found: ID does not exist" Mar 18 18:24:50 crc kubenswrapper[5008]: I0318 18:24:50.255482 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e558c4fe-4c62-49b6-bcbe-838e404d216c" path="/var/lib/kubelet/pods/e558c4fe-4c62-49b6-bcbe-838e404d216c/volumes" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.045264 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5caa8f0-549e-4ace-8215-a99657497d0a","Type":"ContainerStarted","Data":"45610aaeba6fc2f526c220c09dc00b1bdfea1b531dcebee55904fa8c8b98bad8"} Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.045573 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xtddx"] Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.045350 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="ceilometer-central-agent" containerID="cri-o://5e2e803deb3c5464f3549b2db0ada3f8fb8db251aaaddc2f2f513b497e11c6e3" gracePeriod=30 Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.045445 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="ceilometer-notification-agent" containerID="cri-o://cf4a40ff1479075a6371209287d3d3ae6dcae4e36ec269c423c033d2f3268d8a" gracePeriod=30 Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.045482 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="sg-core" containerID="cri-o://e9517de439a689cab6b942f49bc18c83e5ea018fcb3d860d6a8c7d4cb56849b1" gracePeriod=30 Mar 18 18:24:51 crc kubenswrapper[5008]: 
I0318 18:24:51.045478 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="proxy-httpd" containerID="cri-o://45610aaeba6fc2f526c220c09dc00b1bdfea1b531dcebee55904fa8c8b98bad8" gracePeriod=30 Mar 18 18:24:51 crc kubenswrapper[5008]: E0318 18:24:51.045923 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e558c4fe-4c62-49b6-bcbe-838e404d216c" containerName="placement-log" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.045935 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e558c4fe-4c62-49b6-bcbe-838e404d216c" containerName="placement-log" Mar 18 18:24:51 crc kubenswrapper[5008]: E0318 18:24:51.045972 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e558c4fe-4c62-49b6-bcbe-838e404d216c" containerName="placement-api" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.045979 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e558c4fe-4c62-49b6-bcbe-838e404d216c" containerName="placement-api" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.046150 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e558c4fe-4c62-49b6-bcbe-838e404d216c" containerName="placement-log" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.046169 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e558c4fe-4c62-49b6-bcbe-838e404d216c" containerName="placement-api" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.046674 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.047636 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xtddx" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.060918 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xtddx"] Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.101308 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.483400207 podStartE2EDuration="7.10128801s" podCreationTimestamp="2026-03-18 18:24:44 +0000 UTC" firstStartedPulling="2026-03-18 18:24:45.780646971 +0000 UTC m=+1342.300120050" lastFinishedPulling="2026-03-18 18:24:50.398534774 +0000 UTC m=+1346.918007853" observedRunningTime="2026-03-18 18:24:51.089976493 +0000 UTC m=+1347.609449572" watchObservedRunningTime="2026-03-18 18:24:51.10128801 +0000 UTC m=+1347.620761089" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.142608 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-jqltp"] Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.143699 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jqltp" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.183041 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbfjh\" (UniqueName: \"kubernetes.io/projected/496ae433-798d-40a6-b049-d0f33f87b5b4-kube-api-access-lbfjh\") pod \"nova-api-db-create-xtddx\" (UID: \"496ae433-798d-40a6-b049-d0f33f87b5b4\") " pod="openstack/nova-api-db-create-xtddx" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.183129 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/496ae433-798d-40a6-b049-d0f33f87b5b4-operator-scripts\") pod \"nova-api-db-create-xtddx\" (UID: \"496ae433-798d-40a6-b049-d0f33f87b5b4\") " pod="openstack/nova-api-db-create-xtddx" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.195770 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jqltp"] Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.204446 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-efa2-account-create-update-xrxs7"] Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.205867 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-efa2-account-create-update-xrxs7" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.208940 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.214304 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-efa2-account-create-update-xrxs7"] Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.284239 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbfjh\" (UniqueName: \"kubernetes.io/projected/496ae433-798d-40a6-b049-d0f33f87b5b4-kube-api-access-lbfjh\") pod \"nova-api-db-create-xtddx\" (UID: \"496ae433-798d-40a6-b049-d0f33f87b5b4\") " pod="openstack/nova-api-db-create-xtddx" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.284296 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/496ae433-798d-40a6-b049-d0f33f87b5b4-operator-scripts\") pod \"nova-api-db-create-xtddx\" (UID: \"496ae433-798d-40a6-b049-d0f33f87b5b4\") " pod="openstack/nova-api-db-create-xtddx" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.284360 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v26fv\" (UniqueName: \"kubernetes.io/projected/a32dc881-7587-4467-a24e-80483bbd29c4-kube-api-access-v26fv\") pod \"nova-cell0-db-create-jqltp\" (UID: \"a32dc881-7587-4467-a24e-80483bbd29c4\") " pod="openstack/nova-cell0-db-create-jqltp" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.284388 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a32dc881-7587-4467-a24e-80483bbd29c4-operator-scripts\") pod \"nova-cell0-db-create-jqltp\" (UID: \"a32dc881-7587-4467-a24e-80483bbd29c4\") " 
pod="openstack/nova-cell0-db-create-jqltp" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.285487 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/496ae433-798d-40a6-b049-d0f33f87b5b4-operator-scripts\") pod \"nova-api-db-create-xtddx\" (UID: \"496ae433-798d-40a6-b049-d0f33f87b5b4\") " pod="openstack/nova-api-db-create-xtddx" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.301994 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbfjh\" (UniqueName: \"kubernetes.io/projected/496ae433-798d-40a6-b049-d0f33f87b5b4-kube-api-access-lbfjh\") pod \"nova-api-db-create-xtddx\" (UID: \"496ae433-798d-40a6-b049-d0f33f87b5b4\") " pod="openstack/nova-api-db-create-xtddx" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.346568 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-kgmwv"] Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.347609 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kgmwv" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.361068 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kgmwv"] Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.385578 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v26fv\" (UniqueName: \"kubernetes.io/projected/a32dc881-7587-4467-a24e-80483bbd29c4-kube-api-access-v26fv\") pod \"nova-cell0-db-create-jqltp\" (UID: \"a32dc881-7587-4467-a24e-80483bbd29c4\") " pod="openstack/nova-cell0-db-create-jqltp" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.385859 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a32dc881-7587-4467-a24e-80483bbd29c4-operator-scripts\") pod \"nova-cell0-db-create-jqltp\" (UID: \"a32dc881-7587-4467-a24e-80483bbd29c4\") " pod="openstack/nova-cell0-db-create-jqltp" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.385952 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnhw9\" (UniqueName: \"kubernetes.io/projected/6daf86c5-b733-40a0-a1e7-0991e59f4b80-kube-api-access-xnhw9\") pod \"nova-api-efa2-account-create-update-xrxs7\" (UID: \"6daf86c5-b733-40a0-a1e7-0991e59f4b80\") " pod="openstack/nova-api-efa2-account-create-update-xrxs7" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.386121 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6daf86c5-b733-40a0-a1e7-0991e59f4b80-operator-scripts\") pod \"nova-api-efa2-account-create-update-xrxs7\" (UID: \"6daf86c5-b733-40a0-a1e7-0991e59f4b80\") " pod="openstack/nova-api-efa2-account-create-update-xrxs7" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.386725 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a32dc881-7587-4467-a24e-80483bbd29c4-operator-scripts\") pod \"nova-cell0-db-create-jqltp\" (UID: \"a32dc881-7587-4467-a24e-80483bbd29c4\") " pod="openstack/nova-cell0-db-create-jqltp" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.387094 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xtddx" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.396460 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2f63-account-create-update-fw58c"] Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.397727 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2f63-account-create-update-fw58c" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.402134 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.406101 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2f63-account-create-update-fw58c"] Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.411959 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v26fv\" (UniqueName: \"kubernetes.io/projected/a32dc881-7587-4467-a24e-80483bbd29c4-kube-api-access-v26fv\") pod \"nova-cell0-db-create-jqltp\" (UID: \"a32dc881-7587-4467-a24e-80483bbd29c4\") " pod="openstack/nova-cell0-db-create-jqltp" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.488042 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fprm\" (UniqueName: \"kubernetes.io/projected/7299f042-11a3-4875-a5f8-59f18eb2df32-kube-api-access-9fprm\") pod \"nova-cell1-db-create-kgmwv\" (UID: \"7299f042-11a3-4875-a5f8-59f18eb2df32\") " pod="openstack/nova-cell1-db-create-kgmwv" Mar 18 
18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.488121 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6daf86c5-b733-40a0-a1e7-0991e59f4b80-operator-scripts\") pod \"nova-api-efa2-account-create-update-xrxs7\" (UID: \"6daf86c5-b733-40a0-a1e7-0991e59f4b80\") " pod="openstack/nova-api-efa2-account-create-update-xrxs7" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.488580 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7299f042-11a3-4875-a5f8-59f18eb2df32-operator-scripts\") pod \"nova-cell1-db-create-kgmwv\" (UID: \"7299f042-11a3-4875-a5f8-59f18eb2df32\") " pod="openstack/nova-cell1-db-create-kgmwv" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.488962 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnhw9\" (UniqueName: \"kubernetes.io/projected/6daf86c5-b733-40a0-a1e7-0991e59f4b80-kube-api-access-xnhw9\") pod \"nova-api-efa2-account-create-update-xrxs7\" (UID: \"6daf86c5-b733-40a0-a1e7-0991e59f4b80\") " pod="openstack/nova-api-efa2-account-create-update-xrxs7" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.489207 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6daf86c5-b733-40a0-a1e7-0991e59f4b80-operator-scripts\") pod \"nova-api-efa2-account-create-update-xrxs7\" (UID: \"6daf86c5-b733-40a0-a1e7-0991e59f4b80\") " pod="openstack/nova-api-efa2-account-create-update-xrxs7" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.510793 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jqltp" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.513598 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnhw9\" (UniqueName: \"kubernetes.io/projected/6daf86c5-b733-40a0-a1e7-0991e59f4b80-kube-api-access-xnhw9\") pod \"nova-api-efa2-account-create-update-xrxs7\" (UID: \"6daf86c5-b733-40a0-a1e7-0991e59f4b80\") " pod="openstack/nova-api-efa2-account-create-update-xrxs7" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.531651 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-efa2-account-create-update-xrxs7" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.560592 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ae04-account-create-update-hpg5p"] Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.561888 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.563816 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.568719 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ae04-account-create-update-hpg5p"] Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.590790 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fprm\" (UniqueName: \"kubernetes.io/projected/7299f042-11a3-4875-a5f8-59f18eb2df32-kube-api-access-9fprm\") pod \"nova-cell1-db-create-kgmwv\" (UID: \"7299f042-11a3-4875-a5f8-59f18eb2df32\") " pod="openstack/nova-cell1-db-create-kgmwv" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.590870 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/b0593bf5-e080-4ed8-a376-f73ad47f5086-operator-scripts\") pod \"nova-cell0-2f63-account-create-update-fw58c\" (UID: \"b0593bf5-e080-4ed8-a376-f73ad47f5086\") " pod="openstack/nova-cell0-2f63-account-create-update-fw58c" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.590906 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7299f042-11a3-4875-a5f8-59f18eb2df32-operator-scripts\") pod \"nova-cell1-db-create-kgmwv\" (UID: \"7299f042-11a3-4875-a5f8-59f18eb2df32\") " pod="openstack/nova-cell1-db-create-kgmwv" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.590972 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5ml2\" (UniqueName: \"kubernetes.io/projected/b0593bf5-e080-4ed8-a376-f73ad47f5086-kube-api-access-x5ml2\") pod \"nova-cell0-2f63-account-create-update-fw58c\" (UID: \"b0593bf5-e080-4ed8-a376-f73ad47f5086\") " pod="openstack/nova-cell0-2f63-account-create-update-fw58c" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.592210 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7299f042-11a3-4875-a5f8-59f18eb2df32-operator-scripts\") pod \"nova-cell1-db-create-kgmwv\" (UID: \"7299f042-11a3-4875-a5f8-59f18eb2df32\") " pod="openstack/nova-cell1-db-create-kgmwv" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.615118 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fprm\" (UniqueName: \"kubernetes.io/projected/7299f042-11a3-4875-a5f8-59f18eb2df32-kube-api-access-9fprm\") pod \"nova-cell1-db-create-kgmwv\" (UID: \"7299f042-11a3-4875-a5f8-59f18eb2df32\") " pod="openstack/nova-cell1-db-create-kgmwv" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.669546 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kgmwv" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.692912 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1316c67d-d795-4340-88e4-918b6291d950-operator-scripts\") pod \"nova-cell1-ae04-account-create-update-hpg5p\" (UID: \"1316c67d-d795-4340-88e4-918b6291d950\") " pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.692977 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5ml2\" (UniqueName: \"kubernetes.io/projected/b0593bf5-e080-4ed8-a376-f73ad47f5086-kube-api-access-x5ml2\") pod \"nova-cell0-2f63-account-create-update-fw58c\" (UID: \"b0593bf5-e080-4ed8-a376-f73ad47f5086\") " pod="openstack/nova-cell0-2f63-account-create-update-fw58c" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.693064 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvdh5\" (UniqueName: \"kubernetes.io/projected/1316c67d-d795-4340-88e4-918b6291d950-kube-api-access-wvdh5\") pod \"nova-cell1-ae04-account-create-update-hpg5p\" (UID: \"1316c67d-d795-4340-88e4-918b6291d950\") " pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.693110 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0593bf5-e080-4ed8-a376-f73ad47f5086-operator-scripts\") pod \"nova-cell0-2f63-account-create-update-fw58c\" (UID: \"b0593bf5-e080-4ed8-a376-f73ad47f5086\") " pod="openstack/nova-cell0-2f63-account-create-update-fw58c" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.693835 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b0593bf5-e080-4ed8-a376-f73ad47f5086-operator-scripts\") pod \"nova-cell0-2f63-account-create-update-fw58c\" (UID: \"b0593bf5-e080-4ed8-a376-f73ad47f5086\") " pod="openstack/nova-cell0-2f63-account-create-update-fw58c" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.718337 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5ml2\" (UniqueName: \"kubernetes.io/projected/b0593bf5-e080-4ed8-a376-f73ad47f5086-kube-api-access-x5ml2\") pod \"nova-cell0-2f63-account-create-update-fw58c\" (UID: \"b0593bf5-e080-4ed8-a376-f73ad47f5086\") " pod="openstack/nova-cell0-2f63-account-create-update-fw58c" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.794746 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1316c67d-d795-4340-88e4-918b6291d950-operator-scripts\") pod \"nova-cell1-ae04-account-create-update-hpg5p\" (UID: \"1316c67d-d795-4340-88e4-918b6291d950\") " pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.794865 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvdh5\" (UniqueName: \"kubernetes.io/projected/1316c67d-d795-4340-88e4-918b6291d950-kube-api-access-wvdh5\") pod \"nova-cell1-ae04-account-create-update-hpg5p\" (UID: \"1316c67d-d795-4340-88e4-918b6291d950\") " pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.796367 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1316c67d-d795-4340-88e4-918b6291d950-operator-scripts\") pod \"nova-cell1-ae04-account-create-update-hpg5p\" (UID: \"1316c67d-d795-4340-88e4-918b6291d950\") " pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 
18:24:51.802084 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2f63-account-create-update-fw58c" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.821645 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvdh5\" (UniqueName: \"kubernetes.io/projected/1316c67d-d795-4340-88e4-918b6291d950-kube-api-access-wvdh5\") pod \"nova-cell1-ae04-account-create-update-hpg5p\" (UID: \"1316c67d-d795-4340-88e4-918b6291d950\") " pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.878105 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" Mar 18 18:24:51 crc kubenswrapper[5008]: I0318 18:24:51.936007 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xtddx"] Mar 18 18:24:52 crc kubenswrapper[5008]: I0318 18:24:52.013783 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kgmwv"] Mar 18 18:24:52 crc kubenswrapper[5008]: I0318 18:24:52.078644 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-efa2-account-create-update-xrxs7"] Mar 18 18:24:52 crc kubenswrapper[5008]: I0318 18:24:52.086158 5008 generic.go:334] "Generic (PLEG): container finished" podID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerID="45610aaeba6fc2f526c220c09dc00b1bdfea1b531dcebee55904fa8c8b98bad8" exitCode=0 Mar 18 18:24:52 crc kubenswrapper[5008]: I0318 18:24:52.086206 5008 generic.go:334] "Generic (PLEG): container finished" podID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerID="e9517de439a689cab6b942f49bc18c83e5ea018fcb3d860d6a8c7d4cb56849b1" exitCode=2 Mar 18 18:24:52 crc kubenswrapper[5008]: I0318 18:24:52.086218 5008 generic.go:334] "Generic (PLEG): container finished" podID="b5caa8f0-549e-4ace-8215-a99657497d0a" 
containerID="cf4a40ff1479075a6371209287d3d3ae6dcae4e36ec269c423c033d2f3268d8a" exitCode=0 Mar 18 18:24:52 crc kubenswrapper[5008]: I0318 18:24:52.086243 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5caa8f0-549e-4ace-8215-a99657497d0a","Type":"ContainerDied","Data":"45610aaeba6fc2f526c220c09dc00b1bdfea1b531dcebee55904fa8c8b98bad8"} Mar 18 18:24:52 crc kubenswrapper[5008]: I0318 18:24:52.086284 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5caa8f0-549e-4ace-8215-a99657497d0a","Type":"ContainerDied","Data":"e9517de439a689cab6b942f49bc18c83e5ea018fcb3d860d6a8c7d4cb56849b1"} Mar 18 18:24:52 crc kubenswrapper[5008]: I0318 18:24:52.086295 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5caa8f0-549e-4ace-8215-a99657497d0a","Type":"ContainerDied","Data":"cf4a40ff1479075a6371209287d3d3ae6dcae4e36ec269c423c033d2f3268d8a"} Mar 18 18:24:52 crc kubenswrapper[5008]: I0318 18:24:52.087948 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kgmwv" event={"ID":"7299f042-11a3-4875-a5f8-59f18eb2df32","Type":"ContainerStarted","Data":"1e645dad9aeff5211ca6c4e58491eb5850576753b258b68de046dd5126e9bf4f"} Mar 18 18:24:52 crc kubenswrapper[5008]: I0318 18:24:52.089310 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xtddx" event={"ID":"496ae433-798d-40a6-b049-d0f33f87b5b4","Type":"ContainerStarted","Data":"dc41dd2f8e0b52882f497749271c45a3e2235a83693875bc3de0eb205ef91e04"} Mar 18 18:24:52 crc kubenswrapper[5008]: W0318 18:24:52.115882 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6daf86c5_b733_40a0_a1e7_0991e59f4b80.slice/crio-667670d02e5e5001ee8942f9ecde9a97a7d78d68d6f385c831dd002771c4b528 WatchSource:0}: Error finding container 
667670d02e5e5001ee8942f9ecde9a97a7d78d68d6f385c831dd002771c4b528: Status 404 returned error can't find the container with id 667670d02e5e5001ee8942f9ecde9a97a7d78d68d6f385c831dd002771c4b528 Mar 18 18:24:52 crc kubenswrapper[5008]: I0318 18:24:52.148596 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jqltp"] Mar 18 18:24:52 crc kubenswrapper[5008]: I0318 18:24:52.339924 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2f63-account-create-update-fw58c"] Mar 18 18:24:52 crc kubenswrapper[5008]: W0318 18:24:52.343833 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0593bf5_e080_4ed8_a376_f73ad47f5086.slice/crio-de5646029e5bdf20c4d330bbe10dc94b9bb2f05f220c02eb3d6d4f3c64ebb362 WatchSource:0}: Error finding container de5646029e5bdf20c4d330bbe10dc94b9bb2f05f220c02eb3d6d4f3c64ebb362: Status 404 returned error can't find the container with id de5646029e5bdf20c4d330bbe10dc94b9bb2f05f220c02eb3d6d4f3c64ebb362 Mar 18 18:24:52 crc kubenswrapper[5008]: I0318 18:24:52.446694 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ae04-account-create-update-hpg5p"] Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.101440 5008 generic.go:334] "Generic (PLEG): container finished" podID="7299f042-11a3-4875-a5f8-59f18eb2df32" containerID="f2d4a19da4bef551f9b95cdafd5e8f5836f3a1fef19b332a0a1d64c5b9a2f97e" exitCode=0 Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.101527 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kgmwv" event={"ID":"7299f042-11a3-4875-a5f8-59f18eb2df32","Type":"ContainerDied","Data":"f2d4a19da4bef551f9b95cdafd5e8f5836f3a1fef19b332a0a1d64c5b9a2f97e"} Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.103862 5008 generic.go:334] "Generic (PLEG): container finished" podID="6daf86c5-b733-40a0-a1e7-0991e59f4b80" 
containerID="77bd38d173d50c2a5f573f4d959340aa2ed52797345fe6a1d9a1c4829ec6805b" exitCode=0 Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.103925 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-efa2-account-create-update-xrxs7" event={"ID":"6daf86c5-b733-40a0-a1e7-0991e59f4b80","Type":"ContainerDied","Data":"77bd38d173d50c2a5f573f4d959340aa2ed52797345fe6a1d9a1c4829ec6805b"} Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.103952 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-efa2-account-create-update-xrxs7" event={"ID":"6daf86c5-b733-40a0-a1e7-0991e59f4b80","Type":"ContainerStarted","Data":"667670d02e5e5001ee8942f9ecde9a97a7d78d68d6f385c831dd002771c4b528"} Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.105611 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" event={"ID":"1316c67d-d795-4340-88e4-918b6291d950","Type":"ContainerStarted","Data":"1480f123b6b5a86f8bb731e4ddad88a0a42b8eaa878d35119d5cc9fa437d9dfd"} Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.105647 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" event={"ID":"1316c67d-d795-4340-88e4-918b6291d950","Type":"ContainerStarted","Data":"6b243b23a936d76331da7c33c315f32b32d2f9f9f5967652400bd3ce276c78b1"} Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.108227 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2f63-account-create-update-fw58c" event={"ID":"b0593bf5-e080-4ed8-a376-f73ad47f5086","Type":"ContainerStarted","Data":"7ca9a60f117bb64900ea7b4debfbb4a22c4100b53a01f204c6a0b9b167d74ccc"} Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.108253 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2f63-account-create-update-fw58c" 
event={"ID":"b0593bf5-e080-4ed8-a376-f73ad47f5086","Type":"ContainerStarted","Data":"de5646029e5bdf20c4d330bbe10dc94b9bb2f05f220c02eb3d6d4f3c64ebb362"} Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.114519 5008 generic.go:334] "Generic (PLEG): container finished" podID="a32dc881-7587-4467-a24e-80483bbd29c4" containerID="6aada1075429307e6ed4ed62a5ff39092be34f0cb0df6732338d9512f3d639a5" exitCode=0 Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.114595 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jqltp" event={"ID":"a32dc881-7587-4467-a24e-80483bbd29c4","Type":"ContainerDied","Data":"6aada1075429307e6ed4ed62a5ff39092be34f0cb0df6732338d9512f3d639a5"} Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.114619 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jqltp" event={"ID":"a32dc881-7587-4467-a24e-80483bbd29c4","Type":"ContainerStarted","Data":"62b06618960482a7439bf1bbc1b7b23666f6041e21848b601d227aeeb9112b3d"} Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.116474 5008 generic.go:334] "Generic (PLEG): container finished" podID="496ae433-798d-40a6-b049-d0f33f87b5b4" containerID="3e4b194f5d89fb322983ad27203a5faf38b48a02e954c6de483187aa905e57d7" exitCode=0 Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.116519 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xtddx" event={"ID":"496ae433-798d-40a6-b049-d0f33f87b5b4","Type":"ContainerDied","Data":"3e4b194f5d89fb322983ad27203a5faf38b48a02e954c6de483187aa905e57d7"} Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.213349 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-2f63-account-create-update-fw58c" podStartSLOduration=2.21332983 podStartE2EDuration="2.21332983s" podCreationTimestamp="2026-03-18 18:24:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:53.2079832 +0000 UTC m=+1349.727456279" watchObservedRunningTime="2026-03-18 18:24:53.21332983 +0000 UTC m=+1349.732802909" Mar 18 18:24:53 crc kubenswrapper[5008]: I0318 18:24:53.238254 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" podStartSLOduration=2.238237795 podStartE2EDuration="2.238237795s" podCreationTimestamp="2026-03-18 18:24:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:24:53.233215263 +0000 UTC m=+1349.752688342" watchObservedRunningTime="2026-03-18 18:24:53.238237795 +0000 UTC m=+1349.757710864" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.126601 5008 generic.go:334] "Generic (PLEG): container finished" podID="1316c67d-d795-4340-88e4-918b6291d950" containerID="1480f123b6b5a86f8bb731e4ddad88a0a42b8eaa878d35119d5cc9fa437d9dfd" exitCode=0 Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.126666 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" event={"ID":"1316c67d-d795-4340-88e4-918b6291d950","Type":"ContainerDied","Data":"1480f123b6b5a86f8bb731e4ddad88a0a42b8eaa878d35119d5cc9fa437d9dfd"} Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.128730 5008 generic.go:334] "Generic (PLEG): container finished" podID="b0593bf5-e080-4ed8-a376-f73ad47f5086" containerID="7ca9a60f117bb64900ea7b4debfbb4a22c4100b53a01f204c6a0b9b167d74ccc" exitCode=0 Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.128858 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2f63-account-create-update-fw58c" event={"ID":"b0593bf5-e080-4ed8-a376-f73ad47f5086","Type":"ContainerDied","Data":"7ca9a60f117bb64900ea7b4debfbb4a22c4100b53a01f204c6a0b9b167d74ccc"} Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.463673 
5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.464001 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.595750 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kgmwv" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.742122 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fprm\" (UniqueName: \"kubernetes.io/projected/7299f042-11a3-4875-a5f8-59f18eb2df32-kube-api-access-9fprm\") pod \"7299f042-11a3-4875-a5f8-59f18eb2df32\" (UID: \"7299f042-11a3-4875-a5f8-59f18eb2df32\") " Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.742252 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7299f042-11a3-4875-a5f8-59f18eb2df32-operator-scripts\") pod \"7299f042-11a3-4875-a5f8-59f18eb2df32\" (UID: \"7299f042-11a3-4875-a5f8-59f18eb2df32\") " Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.742946 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7299f042-11a3-4875-a5f8-59f18eb2df32-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7299f042-11a3-4875-a5f8-59f18eb2df32" (UID: "7299f042-11a3-4875-a5f8-59f18eb2df32"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.747932 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7299f042-11a3-4875-a5f8-59f18eb2df32-kube-api-access-9fprm" (OuterVolumeSpecName: "kube-api-access-9fprm") pod "7299f042-11a3-4875-a5f8-59f18eb2df32" (UID: "7299f042-11a3-4875-a5f8-59f18eb2df32"). InnerVolumeSpecName "kube-api-access-9fprm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.803110 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-efa2-account-create-update-xrxs7" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.809735 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xtddx" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.824860 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jqltp" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.844567 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7299f042-11a3-4875-a5f8-59f18eb2df32-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.844611 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fprm\" (UniqueName: \"kubernetes.io/projected/7299f042-11a3-4875-a5f8-59f18eb2df32-kube-api-access-9fprm\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.945777 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbfjh\" (UniqueName: \"kubernetes.io/projected/496ae433-798d-40a6-b049-d0f33f87b5b4-kube-api-access-lbfjh\") pod \"496ae433-798d-40a6-b049-d0f33f87b5b4\" (UID: \"496ae433-798d-40a6-b049-d0f33f87b5b4\") " Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.945899 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a32dc881-7587-4467-a24e-80483bbd29c4-operator-scripts\") pod \"a32dc881-7587-4467-a24e-80483bbd29c4\" (UID: \"a32dc881-7587-4467-a24e-80483bbd29c4\") " Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.946016 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnhw9\" (UniqueName: \"kubernetes.io/projected/6daf86c5-b733-40a0-a1e7-0991e59f4b80-kube-api-access-xnhw9\") pod \"6daf86c5-b733-40a0-a1e7-0991e59f4b80\" (UID: \"6daf86c5-b733-40a0-a1e7-0991e59f4b80\") " Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.946042 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v26fv\" (UniqueName: \"kubernetes.io/projected/a32dc881-7587-4467-a24e-80483bbd29c4-kube-api-access-v26fv\") pod 
\"a32dc881-7587-4467-a24e-80483bbd29c4\" (UID: \"a32dc881-7587-4467-a24e-80483bbd29c4\") " Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.946074 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6daf86c5-b733-40a0-a1e7-0991e59f4b80-operator-scripts\") pod \"6daf86c5-b733-40a0-a1e7-0991e59f4b80\" (UID: \"6daf86c5-b733-40a0-a1e7-0991e59f4b80\") " Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.946155 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/496ae433-798d-40a6-b049-d0f33f87b5b4-operator-scripts\") pod \"496ae433-798d-40a6-b049-d0f33f87b5b4\" (UID: \"496ae433-798d-40a6-b049-d0f33f87b5b4\") " Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.946350 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a32dc881-7587-4467-a24e-80483bbd29c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a32dc881-7587-4467-a24e-80483bbd29c4" (UID: "a32dc881-7587-4467-a24e-80483bbd29c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.946532 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6daf86c5-b733-40a0-a1e7-0991e59f4b80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6daf86c5-b733-40a0-a1e7-0991e59f4b80" (UID: "6daf86c5-b733-40a0-a1e7-0991e59f4b80"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.946667 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a32dc881-7587-4467-a24e-80483bbd29c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.946686 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6daf86c5-b733-40a0-a1e7-0991e59f4b80-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.946768 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496ae433-798d-40a6-b049-d0f33f87b5b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "496ae433-798d-40a6-b049-d0f33f87b5b4" (UID: "496ae433-798d-40a6-b049-d0f33f87b5b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.950487 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a32dc881-7587-4467-a24e-80483bbd29c4-kube-api-access-v26fv" (OuterVolumeSpecName: "kube-api-access-v26fv") pod "a32dc881-7587-4467-a24e-80483bbd29c4" (UID: "a32dc881-7587-4467-a24e-80483bbd29c4"). InnerVolumeSpecName "kube-api-access-v26fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.950693 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6daf86c5-b733-40a0-a1e7-0991e59f4b80-kube-api-access-xnhw9" (OuterVolumeSpecName: "kube-api-access-xnhw9") pod "6daf86c5-b733-40a0-a1e7-0991e59f4b80" (UID: "6daf86c5-b733-40a0-a1e7-0991e59f4b80"). InnerVolumeSpecName "kube-api-access-xnhw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:54 crc kubenswrapper[5008]: I0318 18:24:54.950841 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496ae433-798d-40a6-b049-d0f33f87b5b4-kube-api-access-lbfjh" (OuterVolumeSpecName: "kube-api-access-lbfjh") pod "496ae433-798d-40a6-b049-d0f33f87b5b4" (UID: "496ae433-798d-40a6-b049-d0f33f87b5b4"). InnerVolumeSpecName "kube-api-access-lbfjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.048675 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnhw9\" (UniqueName: \"kubernetes.io/projected/6daf86c5-b733-40a0-a1e7-0991e59f4b80-kube-api-access-xnhw9\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.048722 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v26fv\" (UniqueName: \"kubernetes.io/projected/a32dc881-7587-4467-a24e-80483bbd29c4-kube-api-access-v26fv\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.048732 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/496ae433-798d-40a6-b049-d0f33f87b5b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.048740 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbfjh\" (UniqueName: \"kubernetes.io/projected/496ae433-798d-40a6-b049-d0f33f87b5b4-kube-api-access-lbfjh\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.140168 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kgmwv" event={"ID":"7299f042-11a3-4875-a5f8-59f18eb2df32","Type":"ContainerDied","Data":"1e645dad9aeff5211ca6c4e58491eb5850576753b258b68de046dd5126e9bf4f"} Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 
18:24:55.140208 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e645dad9aeff5211ca6c4e58491eb5850576753b258b68de046dd5126e9bf4f" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.140178 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kgmwv" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.143643 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-efa2-account-create-update-xrxs7" event={"ID":"6daf86c5-b733-40a0-a1e7-0991e59f4b80","Type":"ContainerDied","Data":"667670d02e5e5001ee8942f9ecde9a97a7d78d68d6f385c831dd002771c4b528"} Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.143688 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="667670d02e5e5001ee8942f9ecde9a97a7d78d68d6f385c831dd002771c4b528" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.144055 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-efa2-account-create-update-xrxs7" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.145515 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jqltp" event={"ID":"a32dc881-7587-4467-a24e-80483bbd29c4","Type":"ContainerDied","Data":"62b06618960482a7439bf1bbc1b7b23666f6041e21848b601d227aeeb9112b3d"} Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.145542 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62b06618960482a7439bf1bbc1b7b23666f6041e21848b601d227aeeb9112b3d" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.146121 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jqltp" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.147260 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xtddx" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.149776 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xtddx" event={"ID":"496ae433-798d-40a6-b049-d0f33f87b5b4","Type":"ContainerDied","Data":"dc41dd2f8e0b52882f497749271c45a3e2235a83693875bc3de0eb205ef91e04"} Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.149821 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc41dd2f8e0b52882f497749271c45a3e2235a83693875bc3de0eb205ef91e04" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.489286 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.497947 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2f63-account-create-update-fw58c" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.662730 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvdh5\" (UniqueName: \"kubernetes.io/projected/1316c67d-d795-4340-88e4-918b6291d950-kube-api-access-wvdh5\") pod \"1316c67d-d795-4340-88e4-918b6291d950\" (UID: \"1316c67d-d795-4340-88e4-918b6291d950\") " Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.662784 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5ml2\" (UniqueName: \"kubernetes.io/projected/b0593bf5-e080-4ed8-a376-f73ad47f5086-kube-api-access-x5ml2\") pod \"b0593bf5-e080-4ed8-a376-f73ad47f5086\" (UID: \"b0593bf5-e080-4ed8-a376-f73ad47f5086\") " Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.662822 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1316c67d-d795-4340-88e4-918b6291d950-operator-scripts\") pod \"1316c67d-d795-4340-88e4-918b6291d950\" (UID: \"1316c67d-d795-4340-88e4-918b6291d950\") " Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.662950 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0593bf5-e080-4ed8-a376-f73ad47f5086-operator-scripts\") pod \"b0593bf5-e080-4ed8-a376-f73ad47f5086\" (UID: \"b0593bf5-e080-4ed8-a376-f73ad47f5086\") " Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.666035 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1316c67d-d795-4340-88e4-918b6291d950-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1316c67d-d795-4340-88e4-918b6291d950" (UID: "1316c67d-d795-4340-88e4-918b6291d950"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.666831 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0593bf5-e080-4ed8-a376-f73ad47f5086-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0593bf5-e080-4ed8-a376-f73ad47f5086" (UID: "b0593bf5-e080-4ed8-a376-f73ad47f5086"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.667379 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0593bf5-e080-4ed8-a376-f73ad47f5086-kube-api-access-x5ml2" (OuterVolumeSpecName: "kube-api-access-x5ml2") pod "b0593bf5-e080-4ed8-a376-f73ad47f5086" (UID: "b0593bf5-e080-4ed8-a376-f73ad47f5086"). InnerVolumeSpecName "kube-api-access-x5ml2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.668314 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1316c67d-d795-4340-88e4-918b6291d950-kube-api-access-wvdh5" (OuterVolumeSpecName: "kube-api-access-wvdh5") pod "1316c67d-d795-4340-88e4-918b6291d950" (UID: "1316c67d-d795-4340-88e4-918b6291d950"). InnerVolumeSpecName "kube-api-access-wvdh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.753298 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.767436 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvdh5\" (UniqueName: \"kubernetes.io/projected/1316c67d-d795-4340-88e4-918b6291d950-kube-api-access-wvdh5\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.767479 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5ml2\" (UniqueName: \"kubernetes.io/projected/b0593bf5-e080-4ed8-a376-f73ad47f5086-kube-api-access-x5ml2\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.767492 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1316c67d-d795-4340-88e4-918b6291d950-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.767503 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0593bf5-e080-4ed8-a376-f73ad47f5086-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.868797 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4z4m\" (UniqueName: 
\"kubernetes.io/projected/b5caa8f0-549e-4ace-8215-a99657497d0a-kube-api-access-m4z4m\") pod \"b5caa8f0-549e-4ace-8215-a99657497d0a\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.868992 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5caa8f0-549e-4ace-8215-a99657497d0a-log-httpd\") pod \"b5caa8f0-549e-4ace-8215-a99657497d0a\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.869011 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-sg-core-conf-yaml\") pod \"b5caa8f0-549e-4ace-8215-a99657497d0a\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.869071 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-scripts\") pod \"b5caa8f0-549e-4ace-8215-a99657497d0a\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.869094 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5caa8f0-549e-4ace-8215-a99657497d0a-run-httpd\") pod \"b5caa8f0-549e-4ace-8215-a99657497d0a\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.869137 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-combined-ca-bundle\") pod \"b5caa8f0-549e-4ace-8215-a99657497d0a\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.869155 5008 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-config-data\") pod \"b5caa8f0-549e-4ace-8215-a99657497d0a\" (UID: \"b5caa8f0-549e-4ace-8215-a99657497d0a\") " Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.869776 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5caa8f0-549e-4ace-8215-a99657497d0a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b5caa8f0-549e-4ace-8215-a99657497d0a" (UID: "b5caa8f0-549e-4ace-8215-a99657497d0a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.869967 5008 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5caa8f0-549e-4ace-8215-a99657497d0a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.870069 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5caa8f0-549e-4ace-8215-a99657497d0a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b5caa8f0-549e-4ace-8215-a99657497d0a" (UID: "b5caa8f0-549e-4ace-8215-a99657497d0a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.880718 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-scripts" (OuterVolumeSpecName: "scripts") pod "b5caa8f0-549e-4ace-8215-a99657497d0a" (UID: "b5caa8f0-549e-4ace-8215-a99657497d0a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.882062 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5caa8f0-549e-4ace-8215-a99657497d0a-kube-api-access-m4z4m" (OuterVolumeSpecName: "kube-api-access-m4z4m") pod "b5caa8f0-549e-4ace-8215-a99657497d0a" (UID: "b5caa8f0-549e-4ace-8215-a99657497d0a"). InnerVolumeSpecName "kube-api-access-m4z4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.900814 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b5caa8f0-549e-4ace-8215-a99657497d0a" (UID: "b5caa8f0-549e-4ace-8215-a99657497d0a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.959569 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5caa8f0-549e-4ace-8215-a99657497d0a" (UID: "b5caa8f0-549e-4ace-8215-a99657497d0a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.971692 5008 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5caa8f0-549e-4ace-8215-a99657497d0a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.971724 5008 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.971735 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.971744 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.971754 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4z4m\" (UniqueName: \"kubernetes.io/projected/b5caa8f0-549e-4ace-8215-a99657497d0a-kube-api-access-m4z4m\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:55 crc kubenswrapper[5008]: I0318 18:24:55.980284 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-config-data" (OuterVolumeSpecName: "config-data") pod "b5caa8f0-549e-4ace-8215-a99657497d0a" (UID: "b5caa8f0-549e-4ace-8215-a99657497d0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.073179 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5caa8f0-549e-4ace-8215-a99657497d0a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.166031 5008 generic.go:334] "Generic (PLEG): container finished" podID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerID="5e2e803deb3c5464f3549b2db0ada3f8fb8db251aaaddc2f2f513b497e11c6e3" exitCode=0 Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.166071 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5caa8f0-549e-4ace-8215-a99657497d0a","Type":"ContainerDied","Data":"5e2e803deb3c5464f3549b2db0ada3f8fb8db251aaaddc2f2f513b497e11c6e3"} Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.166149 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b5caa8f0-549e-4ace-8215-a99657497d0a","Type":"ContainerDied","Data":"aed5b736131af015f1941fdd6f83f6d56cb359e211190636cca3125e232c83c9"} Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.166165 5008 scope.go:117] "RemoveContainer" containerID="45610aaeba6fc2f526c220c09dc00b1bdfea1b531dcebee55904fa8c8b98bad8" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.166236 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.168049 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.168046 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ae04-account-create-update-hpg5p" event={"ID":"1316c67d-d795-4340-88e4-918b6291d950","Type":"ContainerDied","Data":"6b243b23a936d76331da7c33c315f32b32d2f9f9f5967652400bd3ce276c78b1"} Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.168093 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b243b23a936d76331da7c33c315f32b32d2f9f9f5967652400bd3ce276c78b1" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.170730 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2f63-account-create-update-fw58c" event={"ID":"b0593bf5-e080-4ed8-a376-f73ad47f5086","Type":"ContainerDied","Data":"de5646029e5bdf20c4d330bbe10dc94b9bb2f05f220c02eb3d6d4f3c64ebb362"} Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.170771 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de5646029e5bdf20c4d330bbe10dc94b9bb2f05f220c02eb3d6d4f3c64ebb362" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.170791 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2f63-account-create-update-fw58c" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.198007 5008 scope.go:117] "RemoveContainer" containerID="e9517de439a689cab6b942f49bc18c83e5ea018fcb3d860d6a8c7d4cb56849b1" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.260604 5008 scope.go:117] "RemoveContainer" containerID="cf4a40ff1479075a6371209287d3d3ae6dcae4e36ec269c423c033d2f3268d8a" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.276287 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.276329 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.288491 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 18:24:56.289196 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="ceilometer-notification-agent" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289215 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="ceilometer-notification-agent" Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 18:24:56.289231 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="ceilometer-central-agent" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289237 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="ceilometer-central-agent" Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 18:24:56.289254 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="proxy-httpd" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289260 5008 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="proxy-httpd" Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 18:24:56.289276 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0593bf5-e080-4ed8-a376-f73ad47f5086" containerName="mariadb-account-create-update" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289283 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0593bf5-e080-4ed8-a376-f73ad47f5086" containerName="mariadb-account-create-update" Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 18:24:56.289298 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="sg-core" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289303 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="sg-core" Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 18:24:56.289313 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7299f042-11a3-4875-a5f8-59f18eb2df32" containerName="mariadb-database-create" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289319 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="7299f042-11a3-4875-a5f8-59f18eb2df32" containerName="mariadb-database-create" Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 18:24:56.289325 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6daf86c5-b733-40a0-a1e7-0991e59f4b80" containerName="mariadb-account-create-update" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289332 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6daf86c5-b733-40a0-a1e7-0991e59f4b80" containerName="mariadb-account-create-update" Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 18:24:56.289344 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a32dc881-7587-4467-a24e-80483bbd29c4" containerName="mariadb-database-create" Mar 18 18:24:56 crc 
kubenswrapper[5008]: I0318 18:24:56.289350 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a32dc881-7587-4467-a24e-80483bbd29c4" containerName="mariadb-database-create" Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 18:24:56.289361 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="496ae433-798d-40a6-b049-d0f33f87b5b4" containerName="mariadb-database-create" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289366 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="496ae433-798d-40a6-b049-d0f33f87b5b4" containerName="mariadb-database-create" Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 18:24:56.289379 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1316c67d-d795-4340-88e4-918b6291d950" containerName="mariadb-account-create-update" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289386 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1316c67d-d795-4340-88e4-918b6291d950" containerName="mariadb-account-create-update" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289537 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0593bf5-e080-4ed8-a376-f73ad47f5086" containerName="mariadb-account-create-update" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289549 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6daf86c5-b733-40a0-a1e7-0991e59f4b80" containerName="mariadb-account-create-update" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289592 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="sg-core" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289603 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1316c67d-d795-4340-88e4-918b6291d950" containerName="mariadb-account-create-update" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289612 5008 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7299f042-11a3-4875-a5f8-59f18eb2df32" containerName="mariadb-database-create" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289626 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="ceilometer-notification-agent" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289633 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="ceilometer-central-agent" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289643 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" containerName="proxy-httpd" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289653 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a32dc881-7587-4467-a24e-80483bbd29c4" containerName="mariadb-database-create" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289665 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="496ae433-798d-40a6-b049-d0f33f87b5b4" containerName="mariadb-database-create" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.289914 5008 scope.go:117] "RemoveContainer" containerID="5e2e803deb3c5464f3549b2db0ada3f8fb8db251aaaddc2f2f513b497e11c6e3" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.291147 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.291167 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.291498 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.295908 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.296369 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.296939 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.332803 5008 scope.go:117] "RemoveContainer" containerID="45610aaeba6fc2f526c220c09dc00b1bdfea1b531dcebee55904fa8c8b98bad8" Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 18:24:56.333372 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45610aaeba6fc2f526c220c09dc00b1bdfea1b531dcebee55904fa8c8b98bad8\": container with ID starting with 45610aaeba6fc2f526c220c09dc00b1bdfea1b531dcebee55904fa8c8b98bad8 not found: ID does not exist" containerID="45610aaeba6fc2f526c220c09dc00b1bdfea1b531dcebee55904fa8c8b98bad8" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.333418 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45610aaeba6fc2f526c220c09dc00b1bdfea1b531dcebee55904fa8c8b98bad8"} err="failed to get container status \"45610aaeba6fc2f526c220c09dc00b1bdfea1b531dcebee55904fa8c8b98bad8\": rpc error: code = NotFound desc = could not find container \"45610aaeba6fc2f526c220c09dc00b1bdfea1b531dcebee55904fa8c8b98bad8\": container with ID starting with 45610aaeba6fc2f526c220c09dc00b1bdfea1b531dcebee55904fa8c8b98bad8 not found: ID does not exist" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.333448 5008 scope.go:117] "RemoveContainer" containerID="e9517de439a689cab6b942f49bc18c83e5ea018fcb3d860d6a8c7d4cb56849b1" Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 
18:24:56.333780 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9517de439a689cab6b942f49bc18c83e5ea018fcb3d860d6a8c7d4cb56849b1\": container with ID starting with e9517de439a689cab6b942f49bc18c83e5ea018fcb3d860d6a8c7d4cb56849b1 not found: ID does not exist" containerID="e9517de439a689cab6b942f49bc18c83e5ea018fcb3d860d6a8c7d4cb56849b1" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.333830 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9517de439a689cab6b942f49bc18c83e5ea018fcb3d860d6a8c7d4cb56849b1"} err="failed to get container status \"e9517de439a689cab6b942f49bc18c83e5ea018fcb3d860d6a8c7d4cb56849b1\": rpc error: code = NotFound desc = could not find container \"e9517de439a689cab6b942f49bc18c83e5ea018fcb3d860d6a8c7d4cb56849b1\": container with ID starting with e9517de439a689cab6b942f49bc18c83e5ea018fcb3d860d6a8c7d4cb56849b1 not found: ID does not exist" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.333857 5008 scope.go:117] "RemoveContainer" containerID="cf4a40ff1479075a6371209287d3d3ae6dcae4e36ec269c423c033d2f3268d8a" Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 18:24:56.334122 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf4a40ff1479075a6371209287d3d3ae6dcae4e36ec269c423c033d2f3268d8a\": container with ID starting with cf4a40ff1479075a6371209287d3d3ae6dcae4e36ec269c423c033d2f3268d8a not found: ID does not exist" containerID="cf4a40ff1479075a6371209287d3d3ae6dcae4e36ec269c423c033d2f3268d8a" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.334150 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4a40ff1479075a6371209287d3d3ae6dcae4e36ec269c423c033d2f3268d8a"} err="failed to get container status \"cf4a40ff1479075a6371209287d3d3ae6dcae4e36ec269c423c033d2f3268d8a\": rpc 
error: code = NotFound desc = could not find container \"cf4a40ff1479075a6371209287d3d3ae6dcae4e36ec269c423c033d2f3268d8a\": container with ID starting with cf4a40ff1479075a6371209287d3d3ae6dcae4e36ec269c423c033d2f3268d8a not found: ID does not exist" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.334167 5008 scope.go:117] "RemoveContainer" containerID="5e2e803deb3c5464f3549b2db0ada3f8fb8db251aaaddc2f2f513b497e11c6e3" Mar 18 18:24:56 crc kubenswrapper[5008]: E0318 18:24:56.334340 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e2e803deb3c5464f3549b2db0ada3f8fb8db251aaaddc2f2f513b497e11c6e3\": container with ID starting with 5e2e803deb3c5464f3549b2db0ada3f8fb8db251aaaddc2f2f513b497e11c6e3 not found: ID does not exist" containerID="5e2e803deb3c5464f3549b2db0ada3f8fb8db251aaaddc2f2f513b497e11c6e3" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.334359 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e2e803deb3c5464f3549b2db0ada3f8fb8db251aaaddc2f2f513b497e11c6e3"} err="failed to get container status \"5e2e803deb3c5464f3549b2db0ada3f8fb8db251aaaddc2f2f513b497e11c6e3\": rpc error: code = NotFound desc = could not find container \"5e2e803deb3c5464f3549b2db0ada3f8fb8db251aaaddc2f2f513b497e11c6e3\": container with ID starting with 5e2e803deb3c5464f3549b2db0ada3f8fb8db251aaaddc2f2f513b497e11c6e3 not found: ID does not exist" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.339979 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.360139 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.481187 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-config-data\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.481243 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f913711-2d07-4cd4-84f0-a28baffb6b79-run-httpd\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.481332 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f913711-2d07-4cd4-84f0-a28baffb6b79-log-httpd\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.481350 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-245ff\" (UniqueName: \"kubernetes.io/projected/3f913711-2d07-4cd4-84f0-a28baffb6b79-kube-api-access-245ff\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.481374 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.481486 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.481542 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-scripts\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.582976 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-scripts\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.583022 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-config-data\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.583050 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f913711-2d07-4cd4-84f0-a28baffb6b79-run-httpd\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.583076 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f913711-2d07-4cd4-84f0-a28baffb6b79-log-httpd\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.583091 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-245ff\" (UniqueName: \"kubernetes.io/projected/3f913711-2d07-4cd4-84f0-a28baffb6b79-kube-api-access-245ff\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.583115 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.583205 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.584678 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f913711-2d07-4cd4-84f0-a28baffb6b79-log-httpd\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.584726 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f913711-2d07-4cd4-84f0-a28baffb6b79-run-httpd\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.588871 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.592390 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.592626 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-scripts\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.593439 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-config-data\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.632459 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-245ff\" (UniqueName: \"kubernetes.io/projected/3f913711-2d07-4cd4-84f0-a28baffb6b79-kube-api-access-245ff\") pod \"ceilometer-0\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " pod="openstack/ceilometer-0" Mar 18 18:24:56 crc kubenswrapper[5008]: I0318 18:24:56.926105 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:24:57 crc kubenswrapper[5008]: I0318 18:24:57.181304 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 18:24:57 crc kubenswrapper[5008]: I0318 18:24:57.181689 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 18:24:57 crc kubenswrapper[5008]: I0318 18:24:57.290273 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 18:24:57 crc kubenswrapper[5008]: I0318 18:24:57.290345 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 18:24:57 crc kubenswrapper[5008]: I0318 18:24:57.320662 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 18:24:57 crc kubenswrapper[5008]: I0318 18:24:57.330393 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 18:24:57 crc kubenswrapper[5008]: W0318 18:24:57.404402 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f913711_2d07_4cd4_84f0_a28baffb6b79.slice/crio-07ebb703e28bf764f63d40b4133e5c9080bbf1dc7a099b7659e15a7901f546d8 WatchSource:0}: Error finding container 07ebb703e28bf764f63d40b4133e5c9080bbf1dc7a099b7659e15a7901f546d8: Status 404 returned error can't find the container with id 07ebb703e28bf764f63d40b4133e5c9080bbf1dc7a099b7659e15a7901f546d8 Mar 18 18:24:57 crc kubenswrapper[5008]: I0318 18:24:57.412973 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:24:58 crc kubenswrapper[5008]: I0318 18:24:58.193711 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3f913711-2d07-4cd4-84f0-a28baffb6b79","Type":"ContainerStarted","Data":"07ebb703e28bf764f63d40b4133e5c9080bbf1dc7a099b7659e15a7901f546d8"} Mar 18 18:24:58 crc kubenswrapper[5008]: I0318 18:24:58.194250 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 18:24:58 crc kubenswrapper[5008]: I0318 18:24:58.194292 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 18:24:58 crc kubenswrapper[5008]: I0318 18:24:58.209110 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5caa8f0-549e-4ace-8215-a99657497d0a" path="/var/lib/kubelet/pods/b5caa8f0-549e-4ace-8215-a99657497d0a/volumes" Mar 18 18:24:59 crc kubenswrapper[5008]: I0318 18:24:59.119455 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 18:24:59 crc kubenswrapper[5008]: I0318 18:24:59.158063 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 18:24:59 crc kubenswrapper[5008]: I0318 18:24:59.215597 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f913711-2d07-4cd4-84f0-a28baffb6b79","Type":"ContainerStarted","Data":"824ddbd5e53959ad5a073db431e345df2f3e0e1e485d22b5c38e48498c093a6c"} Mar 18 18:25:00 crc kubenswrapper[5008]: I0318 18:25:00.223655 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 18:25:00 crc kubenswrapper[5008]: I0318 18:25:00.225177 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f913711-2d07-4cd4-84f0-a28baffb6b79","Type":"ContainerStarted","Data":"7d9b1ff3a3c427f915bb19b684cfd611a22229c718fa1588cb7c471b62494d87"} Mar 18 18:25:00 crc kubenswrapper[5008]: I0318 18:25:00.225260 5008 prober_manager.go:312] 
"Failed to trigger a manual run" probe="Readiness" Mar 18 18:25:00 crc kubenswrapper[5008]: I0318 18:25:00.385084 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.236257 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f913711-2d07-4cd4-84f0-a28baffb6b79","Type":"ContainerStarted","Data":"d47b1282daf1f99beac7e27d1625c58e8f607ddda65adc7bfd8f293f808b44c2"} Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.730749 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7bp5n"] Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.735976 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.745032 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.746810 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.747090 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7bp5n"] Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.748744 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-24rgk" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.779006 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7bp5n\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " 
pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.779164 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9wvq\" (UniqueName: \"kubernetes.io/projected/671ab4c3-3b06-428a-83e4-104ca0251bf1-kube-api-access-g9wvq\") pod \"nova-cell0-conductor-db-sync-7bp5n\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.779208 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-scripts\") pod \"nova-cell0-conductor-db-sync-7bp5n\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.779248 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-config-data\") pod \"nova-cell0-conductor-db-sync-7bp5n\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.880867 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-scripts\") pod \"nova-cell0-conductor-db-sync-7bp5n\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.880973 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-config-data\") pod \"nova-cell0-conductor-db-sync-7bp5n\" (UID: 
\"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.881087 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7bp5n\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.881304 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9wvq\" (UniqueName: \"kubernetes.io/projected/671ab4c3-3b06-428a-83e4-104ca0251bf1-kube-api-access-g9wvq\") pod \"nova-cell0-conductor-db-sync-7bp5n\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.890237 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-scripts\") pod \"nova-cell0-conductor-db-sync-7bp5n\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.890327 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-config-data\") pod \"nova-cell0-conductor-db-sync-7bp5n\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.890332 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7bp5n\" (UID: 
\"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:01 crc kubenswrapper[5008]: I0318 18:25:01.910122 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9wvq\" (UniqueName: \"kubernetes.io/projected/671ab4c3-3b06-428a-83e4-104ca0251bf1-kube-api-access-g9wvq\") pod \"nova-cell0-conductor-db-sync-7bp5n\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:02 crc kubenswrapper[5008]: I0318 18:25:02.051256 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:02 crc kubenswrapper[5008]: I0318 18:25:02.576238 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7bp5n"] Mar 18 18:25:03 crc kubenswrapper[5008]: I0318 18:25:03.255383 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7bp5n" event={"ID":"671ab4c3-3b06-428a-83e4-104ca0251bf1","Type":"ContainerStarted","Data":"b8de6ce4426afb58dcb6e0dc408c3eaf0ef87c7eebfef712fb582bb7fe359ff2"} Mar 18 18:25:04 crc kubenswrapper[5008]: I0318 18:25:04.267941 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f913711-2d07-4cd4-84f0-a28baffb6b79","Type":"ContainerStarted","Data":"2b0fc7f85c9c320c40ca861c593a3620b70d44e48bb0ed934a67d01439544ee5"} Mar 18 18:25:04 crc kubenswrapper[5008]: I0318 18:25:04.269312 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 18:25:04 crc kubenswrapper[5008]: I0318 18:25:04.289874 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9952147249999999 podStartE2EDuration="8.289854983s" podCreationTimestamp="2026-03-18 18:24:56 +0000 UTC" firstStartedPulling="2026-03-18 18:24:57.406370983 +0000 UTC 
m=+1353.925844062" lastFinishedPulling="2026-03-18 18:25:03.701011241 +0000 UTC m=+1360.220484320" observedRunningTime="2026-03-18 18:25:04.288711173 +0000 UTC m=+1360.808184282" watchObservedRunningTime="2026-03-18 18:25:04.289854983 +0000 UTC m=+1360.809328062" Mar 18 18:25:10 crc kubenswrapper[5008]: I0318 18:25:10.336230 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7bp5n" event={"ID":"671ab4c3-3b06-428a-83e4-104ca0251bf1","Type":"ContainerStarted","Data":"ecc55674f283d6170293a51c52687d333d59880f248c46a61dc6f3dc815eed79"} Mar 18 18:25:19 crc kubenswrapper[5008]: I0318 18:25:19.437023 5008 generic.go:334] "Generic (PLEG): container finished" podID="671ab4c3-3b06-428a-83e4-104ca0251bf1" containerID="ecc55674f283d6170293a51c52687d333d59880f248c46a61dc6f3dc815eed79" exitCode=0 Mar 18 18:25:19 crc kubenswrapper[5008]: I0318 18:25:19.437107 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7bp5n" event={"ID":"671ab4c3-3b06-428a-83e4-104ca0251bf1","Type":"ContainerDied","Data":"ecc55674f283d6170293a51c52687d333d59880f248c46a61dc6f3dc815eed79"} Mar 18 18:25:20 crc kubenswrapper[5008]: I0318 18:25:20.775409 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:20 crc kubenswrapper[5008]: I0318 18:25:20.882237 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9wvq\" (UniqueName: \"kubernetes.io/projected/671ab4c3-3b06-428a-83e4-104ca0251bf1-kube-api-access-g9wvq\") pod \"671ab4c3-3b06-428a-83e4-104ca0251bf1\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " Mar 18 18:25:20 crc kubenswrapper[5008]: I0318 18:25:20.882307 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-scripts\") pod \"671ab4c3-3b06-428a-83e4-104ca0251bf1\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " Mar 18 18:25:20 crc kubenswrapper[5008]: I0318 18:25:20.882390 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-config-data\") pod \"671ab4c3-3b06-428a-83e4-104ca0251bf1\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " Mar 18 18:25:20 crc kubenswrapper[5008]: I0318 18:25:20.883229 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-combined-ca-bundle\") pod \"671ab4c3-3b06-428a-83e4-104ca0251bf1\" (UID: \"671ab4c3-3b06-428a-83e4-104ca0251bf1\") " Mar 18 18:25:20 crc kubenswrapper[5008]: I0318 18:25:20.887587 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671ab4c3-3b06-428a-83e4-104ca0251bf1-kube-api-access-g9wvq" (OuterVolumeSpecName: "kube-api-access-g9wvq") pod "671ab4c3-3b06-428a-83e4-104ca0251bf1" (UID: "671ab4c3-3b06-428a-83e4-104ca0251bf1"). InnerVolumeSpecName "kube-api-access-g9wvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:20 crc kubenswrapper[5008]: I0318 18:25:20.887897 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-scripts" (OuterVolumeSpecName: "scripts") pod "671ab4c3-3b06-428a-83e4-104ca0251bf1" (UID: "671ab4c3-3b06-428a-83e4-104ca0251bf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:20 crc kubenswrapper[5008]: I0318 18:25:20.911735 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-config-data" (OuterVolumeSpecName: "config-data") pod "671ab4c3-3b06-428a-83e4-104ca0251bf1" (UID: "671ab4c3-3b06-428a-83e4-104ca0251bf1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:20 crc kubenswrapper[5008]: I0318 18:25:20.914203 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "671ab4c3-3b06-428a-83e4-104ca0251bf1" (UID: "671ab4c3-3b06-428a-83e4-104ca0251bf1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:20 crc kubenswrapper[5008]: I0318 18:25:20.986404 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:20 crc kubenswrapper[5008]: I0318 18:25:20.986430 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9wvq\" (UniqueName: \"kubernetes.io/projected/671ab4c3-3b06-428a-83e4-104ca0251bf1-kube-api-access-g9wvq\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:20 crc kubenswrapper[5008]: I0318 18:25:20.986440 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:20 crc kubenswrapper[5008]: I0318 18:25:20.986449 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671ab4c3-3b06-428a-83e4-104ca0251bf1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.463207 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7bp5n" event={"ID":"671ab4c3-3b06-428a-83e4-104ca0251bf1","Type":"ContainerDied","Data":"b8de6ce4426afb58dcb6e0dc408c3eaf0ef87c7eebfef712fb582bb7fe359ff2"} Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.463601 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8de6ce4426afb58dcb6e0dc408c3eaf0ef87c7eebfef712fb582bb7fe359ff2" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.463298 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7bp5n" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.613034 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:25:21 crc kubenswrapper[5008]: E0318 18:25:21.613627 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671ab4c3-3b06-428a-83e4-104ca0251bf1" containerName="nova-cell0-conductor-db-sync" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.613659 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ab4c3-3b06-428a-83e4-104ca0251bf1" containerName="nova-cell0-conductor-db-sync" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.613964 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="671ab4c3-3b06-428a-83e4-104ca0251bf1" containerName="nova-cell0-conductor-db-sync" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.614957 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.618778 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.620405 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-24rgk" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.625020 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.698400 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed55404d-2d05-4776-abed-7579ae87933d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ed55404d-2d05-4776-abed-7579ae87933d\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:21 crc kubenswrapper[5008]: 
I0318 18:25:21.698524 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlwq9\" (UniqueName: \"kubernetes.io/projected/ed55404d-2d05-4776-abed-7579ae87933d-kube-api-access-xlwq9\") pod \"nova-cell0-conductor-0\" (UID: \"ed55404d-2d05-4776-abed-7579ae87933d\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.698621 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed55404d-2d05-4776-abed-7579ae87933d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ed55404d-2d05-4776-abed-7579ae87933d\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.800269 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed55404d-2d05-4776-abed-7579ae87933d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ed55404d-2d05-4776-abed-7579ae87933d\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.800372 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlwq9\" (UniqueName: \"kubernetes.io/projected/ed55404d-2d05-4776-abed-7579ae87933d-kube-api-access-xlwq9\") pod \"nova-cell0-conductor-0\" (UID: \"ed55404d-2d05-4776-abed-7579ae87933d\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.800442 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed55404d-2d05-4776-abed-7579ae87933d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ed55404d-2d05-4776-abed-7579ae87933d\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.805961 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed55404d-2d05-4776-abed-7579ae87933d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ed55404d-2d05-4776-abed-7579ae87933d\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.810371 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed55404d-2d05-4776-abed-7579ae87933d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ed55404d-2d05-4776-abed-7579ae87933d\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.825153 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlwq9\" (UniqueName: \"kubernetes.io/projected/ed55404d-2d05-4776-abed-7579ae87933d-kube-api-access-xlwq9\") pod \"nova-cell0-conductor-0\" (UID: \"ed55404d-2d05-4776-abed-7579ae87933d\") " pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:21 crc kubenswrapper[5008]: I0318 18:25:21.934155 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:22 crc kubenswrapper[5008]: I0318 18:25:22.380167 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:25:22 crc kubenswrapper[5008]: W0318 18:25:22.386947 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded55404d_2d05_4776_abed_7579ae87933d.slice/crio-82739a31c47e5120994fbbbb5e23bab6db863c7c56fbd5559b125868e9f21935 WatchSource:0}: Error finding container 82739a31c47e5120994fbbbb5e23bab6db863c7c56fbd5559b125868e9f21935: Status 404 returned error can't find the container with id 82739a31c47e5120994fbbbb5e23bab6db863c7c56fbd5559b125868e9f21935 Mar 18 18:25:22 crc kubenswrapper[5008]: I0318 18:25:22.474983 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ed55404d-2d05-4776-abed-7579ae87933d","Type":"ContainerStarted","Data":"82739a31c47e5120994fbbbb5e23bab6db863c7c56fbd5559b125868e9f21935"} Mar 18 18:25:23 crc kubenswrapper[5008]: I0318 18:25:23.486541 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ed55404d-2d05-4776-abed-7579ae87933d","Type":"ContainerStarted","Data":"09257f442253d20a46172ce50016f2a69f725c7bb94a9a735847c735bae85e65"} Mar 18 18:25:23 crc kubenswrapper[5008]: I0318 18:25:23.486992 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:23 crc kubenswrapper[5008]: I0318 18:25:23.521955 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.52193641 podStartE2EDuration="2.52193641s" podCreationTimestamp="2026-03-18 18:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:25:23.516423475 
+0000 UTC m=+1380.035896564" watchObservedRunningTime="2026-03-18 18:25:23.52193641 +0000 UTC m=+1380.041409489" Mar 18 18:25:24 crc kubenswrapper[5008]: I0318 18:25:24.460083 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:25:24 crc kubenswrapper[5008]: I0318 18:25:24.460622 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:25:24 crc kubenswrapper[5008]: I0318 18:25:24.460719 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:25:24 crc kubenswrapper[5008]: I0318 18:25:24.461701 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b61732c3f8965875c6dd9c25b3aae8cc8d81ecff790f1af827da9944801bd467"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:25:24 crc kubenswrapper[5008]: I0318 18:25:24.461815 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://b61732c3f8965875c6dd9c25b3aae8cc8d81ecff790f1af827da9944801bd467" gracePeriod=600 Mar 18 18:25:25 crc kubenswrapper[5008]: I0318 18:25:25.515642 5008 generic.go:334] "Generic (PLEG): 
container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="b61732c3f8965875c6dd9c25b3aae8cc8d81ecff790f1af827da9944801bd467" exitCode=0 Mar 18 18:25:25 crc kubenswrapper[5008]: I0318 18:25:25.515704 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"b61732c3f8965875c6dd9c25b3aae8cc8d81ecff790f1af827da9944801bd467"} Mar 18 18:25:25 crc kubenswrapper[5008]: I0318 18:25:25.516358 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241"} Mar 18 18:25:25 crc kubenswrapper[5008]: I0318 18:25:25.516394 5008 scope.go:117] "RemoveContainer" containerID="fd022c5c3ebfc1487f31b5991ba1ecc58d2f77c9bf3db917b976667648d3cca3" Mar 18 18:25:26 crc kubenswrapper[5008]: I0318 18:25:26.931796 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 18:25:30 crc kubenswrapper[5008]: I0318 18:25:30.648814 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:25:30 crc kubenswrapper[5008]: I0318 18:25:30.649823 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="96ede22a-8990-49f1-8bb9-f0c08bb3c8b3" containerName="kube-state-metrics" containerID="cri-o://abbcba4b3618e5a61f5319eee8d12ca14a70d169321ac4caec0de0373a4a74ee" gracePeriod=30 Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.161272 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.286902 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5qms\" (UniqueName: \"kubernetes.io/projected/96ede22a-8990-49f1-8bb9-f0c08bb3c8b3-kube-api-access-w5qms\") pod \"96ede22a-8990-49f1-8bb9-f0c08bb3c8b3\" (UID: \"96ede22a-8990-49f1-8bb9-f0c08bb3c8b3\") " Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.295758 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ede22a-8990-49f1-8bb9-f0c08bb3c8b3-kube-api-access-w5qms" (OuterVolumeSpecName: "kube-api-access-w5qms") pod "96ede22a-8990-49f1-8bb9-f0c08bb3c8b3" (UID: "96ede22a-8990-49f1-8bb9-f0c08bb3c8b3"). InnerVolumeSpecName "kube-api-access-w5qms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.389173 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5qms\" (UniqueName: \"kubernetes.io/projected/96ede22a-8990-49f1-8bb9-f0c08bb3c8b3-kube-api-access-w5qms\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.587073 5008 generic.go:334] "Generic (PLEG): container finished" podID="96ede22a-8990-49f1-8bb9-f0c08bb3c8b3" containerID="abbcba4b3618e5a61f5319eee8d12ca14a70d169321ac4caec0de0373a4a74ee" exitCode=2 Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.587120 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"96ede22a-8990-49f1-8bb9-f0c08bb3c8b3","Type":"ContainerDied","Data":"abbcba4b3618e5a61f5319eee8d12ca14a70d169321ac4caec0de0373a4a74ee"} Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.587150 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"96ede22a-8990-49f1-8bb9-f0c08bb3c8b3","Type":"ContainerDied","Data":"24ae8ff3e7ba0ad952a9120f4330a6b8038c5dfcd70abbd4762bf02cfeca2300"} Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.587169 5008 scope.go:117] "RemoveContainer" containerID="abbcba4b3618e5a61f5319eee8d12ca14a70d169321ac4caec0de0373a4a74ee" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.587203 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.618321 5008 scope.go:117] "RemoveContainer" containerID="abbcba4b3618e5a61f5319eee8d12ca14a70d169321ac4caec0de0373a4a74ee" Mar 18 18:25:31 crc kubenswrapper[5008]: E0318 18:25:31.618954 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abbcba4b3618e5a61f5319eee8d12ca14a70d169321ac4caec0de0373a4a74ee\": container with ID starting with abbcba4b3618e5a61f5319eee8d12ca14a70d169321ac4caec0de0373a4a74ee not found: ID does not exist" containerID="abbcba4b3618e5a61f5319eee8d12ca14a70d169321ac4caec0de0373a4a74ee" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.618998 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abbcba4b3618e5a61f5319eee8d12ca14a70d169321ac4caec0de0373a4a74ee"} err="failed to get container status \"abbcba4b3618e5a61f5319eee8d12ca14a70d169321ac4caec0de0373a4a74ee\": rpc error: code = NotFound desc = could not find container \"abbcba4b3618e5a61f5319eee8d12ca14a70d169321ac4caec0de0373a4a74ee\": container with ID starting with abbcba4b3618e5a61f5319eee8d12ca14a70d169321ac4caec0de0373a4a74ee not found: ID does not exist" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.630646 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.638975 5008 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.667773 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:25:31 crc kubenswrapper[5008]: E0318 18:25:31.668059 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ede22a-8990-49f1-8bb9-f0c08bb3c8b3" containerName="kube-state-metrics" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.668070 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ede22a-8990-49f1-8bb9-f0c08bb3c8b3" containerName="kube-state-metrics" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.668247 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ede22a-8990-49f1-8bb9-f0c08bb3c8b3" containerName="kube-state-metrics" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.668850 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.670425 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.670963 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.684464 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.797266 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.797330 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.797375 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.797629 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz9n2\" (UniqueName: \"kubernetes.io/projected/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-api-access-jz9n2\") pod \"kube-state-metrics-0\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.899385 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.899473 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.899530 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.899724 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz9n2\" (UniqueName: \"kubernetes.io/projected/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-api-access-jz9n2\") pod \"kube-state-metrics-0\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.907693 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.907789 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.913381 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.922416 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jz9n2\" (UniqueName: \"kubernetes.io/projected/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-api-access-jz9n2\") pod \"kube-state-metrics-0\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " pod="openstack/kube-state-metrics-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.969030 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 18:25:31 crc kubenswrapper[5008]: I0318 18:25:31.995140 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.220597 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96ede22a-8990-49f1-8bb9-f0c08bb3c8b3" path="/var/lib/kubelet/pods/96ede22a-8990-49f1-8bb9-f0c08bb3c8b3/volumes" Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.321138 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.321462 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="ceilometer-central-agent" containerID="cri-o://824ddbd5e53959ad5a073db431e345df2f3e0e1e485d22b5c38e48498c093a6c" gracePeriod=30 Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.321514 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="sg-core" containerID="cri-o://d47b1282daf1f99beac7e27d1625c58e8f607ddda65adc7bfd8f293f808b44c2" gracePeriod=30 Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.321479 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="proxy-httpd" 
containerID="cri-o://2b0fc7f85c9c320c40ca861c593a3620b70d44e48bb0ed934a67d01439544ee5" gracePeriod=30 Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.321818 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="ceilometer-notification-agent" containerID="cri-o://7d9b1ff3a3c427f915bb19b684cfd611a22229c718fa1588cb7c471b62494d87" gracePeriod=30 Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.459737 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-52mqb"] Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.461079 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-52mqb" Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.466823 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.474343 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-52mqb"] Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.482850 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.514890 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll77p\" (UniqueName: \"kubernetes.io/projected/97b986d5-bdff-4a92-bec9-27511e91dd2b-kube-api-access-ll77p\") pod \"nova-cell0-cell-mapping-52mqb\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " pod="openstack/nova-cell0-cell-mapping-52mqb" Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.515084 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-scripts\") pod \"nova-cell0-cell-mapping-52mqb\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " pod="openstack/nova-cell0-cell-mapping-52mqb" Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.515159 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-52mqb\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " pod="openstack/nova-cell0-cell-mapping-52mqb" Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.515301 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-config-data\") pod \"nova-cell0-cell-mapping-52mqb\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " pod="openstack/nova-cell0-cell-mapping-52mqb" Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.522241 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.598674 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84","Type":"ContainerStarted","Data":"c9ddd22193772321e5bdebad2a1b11d644150a6ef8f1bbece8f5136d63a3e59d"} Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.604193 5008 generic.go:334] "Generic (PLEG): container finished" podID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerID="2b0fc7f85c9c320c40ca861c593a3620b70d44e48bb0ed934a67d01439544ee5" exitCode=0 Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.604231 5008 generic.go:334] "Generic (PLEG): container finished" podID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerID="d47b1282daf1f99beac7e27d1625c58e8f607ddda65adc7bfd8f293f808b44c2" exitCode=2 Mar 
18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.604250 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f913711-2d07-4cd4-84f0-a28baffb6b79","Type":"ContainerDied","Data":"2b0fc7f85c9c320c40ca861c593a3620b70d44e48bb0ed934a67d01439544ee5"}
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.604272 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f913711-2d07-4cd4-84f0-a28baffb6b79","Type":"ContainerDied","Data":"d47b1282daf1f99beac7e27d1625c58e8f607ddda65adc7bfd8f293f808b44c2"}
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.617003 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-scripts\") pod \"nova-cell0-cell-mapping-52mqb\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " pod="openstack/nova-cell0-cell-mapping-52mqb"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.617069 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-52mqb\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " pod="openstack/nova-cell0-cell-mapping-52mqb"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.617152 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-config-data\") pod \"nova-cell0-cell-mapping-52mqb\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " pod="openstack/nova-cell0-cell-mapping-52mqb"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.617191 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll77p\" (UniqueName: \"kubernetes.io/projected/97b986d5-bdff-4a92-bec9-27511e91dd2b-kube-api-access-ll77p\") pod \"nova-cell0-cell-mapping-52mqb\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " pod="openstack/nova-cell0-cell-mapping-52mqb"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.626750 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-52mqb\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " pod="openstack/nova-cell0-cell-mapping-52mqb"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.630872 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-config-data\") pod \"nova-cell0-cell-mapping-52mqb\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " pod="openstack/nova-cell0-cell-mapping-52mqb"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.632158 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-scripts\") pod \"nova-cell0-cell-mapping-52mqb\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " pod="openstack/nova-cell0-cell-mapping-52mqb"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.647277 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.651216 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll77p\" (UniqueName: \"kubernetes.io/projected/97b986d5-bdff-4a92-bec9-27511e91dd2b-kube-api-access-ll77p\") pod \"nova-cell0-cell-mapping-52mqb\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " pod="openstack/nova-cell0-cell-mapping-52mqb"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.661427 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.682620 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.692399 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.704867 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.706676 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.709614 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.720370 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-logs\") pod \"nova-api-0\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " pod="openstack/nova-api-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.720650 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-897b5\" (UniqueName: \"kubernetes.io/projected/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-kube-api-access-897b5\") pod \"nova-api-0\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " pod="openstack/nova-api-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.720777 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " pod="openstack/nova-api-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.720888 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-config-data\") pod \"nova-api-0\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " pod="openstack/nova-api-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.739993 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.823731 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs9lx\" (UniqueName: \"kubernetes.io/projected/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-kube-api-access-rs9lx\") pod \"nova-cell1-novncproxy-0\" (UID: \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.823953 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.824108 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-logs\") pod \"nova-api-0\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " pod="openstack/nova-api-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.824204 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-897b5\" (UniqueName: \"kubernetes.io/projected/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-kube-api-access-897b5\") pod \"nova-api-0\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " pod="openstack/nova-api-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.824319 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " pod="openstack/nova-api-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.824438 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-config-data\") pod \"nova-api-0\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " pod="openstack/nova-api-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.824526 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.824821 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-52mqb"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.825485 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-logs\") pod \"nova-api-0\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " pod="openstack/nova-api-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.833320 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " pod="openstack/nova-api-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.834864 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-config-data\") pod \"nova-api-0\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " pod="openstack/nova-api-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.862850 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-897b5\" (UniqueName: \"kubernetes.io/projected/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-kube-api-access-897b5\") pod \"nova-api-0\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " pod="openstack/nova-api-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.882054 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.883417 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.889953 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.904216 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.910784 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.914133 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.927623 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.928536 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04252a4e-5613-4a4d-b105-148f1db99d7e-config-data\") pod \"nova-metadata-0\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " pod="openstack/nova-metadata-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.928621 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs9lx\" (UniqueName: \"kubernetes.io/projected/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-kube-api-access-rs9lx\") pod \"nova-cell1-novncproxy-0\" (UID: \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.928644 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.928697 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04252a4e-5613-4a4d-b105-148f1db99d7e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " pod="openstack/nova-metadata-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.928732 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04252a4e-5613-4a4d-b105-148f1db99d7e-logs\") pod \"nova-metadata-0\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " pod="openstack/nova-metadata-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.928755 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.928784 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfsdh\" (UniqueName: \"kubernetes.io/projected/04252a4e-5613-4a4d-b105-148f1db99d7e-kube-api-access-pfsdh\") pod \"nova-metadata-0\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " pod="openstack/nova-metadata-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.946190 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.946266 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.976173 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:25:32 crc kubenswrapper[5008]: I0318 18:25:32.992610 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs9lx\" (UniqueName: \"kubernetes.io/projected/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-kube-api-access-rs9lx\") pod \"nova-cell1-novncproxy-0\" (UID: \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.058923 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b495b9cc7-9rfgw"]
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.062111 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.073040 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfsdh\" (UniqueName: \"kubernetes.io/projected/04252a4e-5613-4a4d-b105-148f1db99d7e-kube-api-access-pfsdh\") pod \"nova-metadata-0\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " pod="openstack/nova-metadata-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.073108 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgvmd\" (UniqueName: \"kubernetes.io/projected/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-kube-api-access-wgvmd\") pod \"nova-scheduler-0\" (UID: \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\") " pod="openstack/nova-scheduler-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.073137 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04252a4e-5613-4a4d-b105-148f1db99d7e-config-data\") pod \"nova-metadata-0\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " pod="openstack/nova-metadata-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.073294 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\") " pod="openstack/nova-scheduler-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.073402 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-config-data\") pod \"nova-scheduler-0\" (UID: \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\") " pod="openstack/nova-scheduler-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.073471 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04252a4e-5613-4a4d-b105-148f1db99d7e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " pod="openstack/nova-metadata-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.073573 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04252a4e-5613-4a4d-b105-148f1db99d7e-logs\") pod \"nova-metadata-0\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " pod="openstack/nova-metadata-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.077226 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04252a4e-5613-4a4d-b105-148f1db99d7e-logs\") pod \"nova-metadata-0\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " pod="openstack/nova-metadata-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.102377 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.105208 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04252a4e-5613-4a4d-b105-148f1db99d7e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " pod="openstack/nova-metadata-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.105802 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04252a4e-5613-4a4d-b105-148f1db99d7e-config-data\") pod \"nova-metadata-0\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " pod="openstack/nova-metadata-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.107174 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.107541 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfsdh\" (UniqueName: \"kubernetes.io/projected/04252a4e-5613-4a4d-b105-148f1db99d7e-kube-api-access-pfsdh\") pod \"nova-metadata-0\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " pod="openstack/nova-metadata-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.115119 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b495b9cc7-9rfgw"]
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.178041 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgvmd\" (UniqueName: \"kubernetes.io/projected/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-kube-api-access-wgvmd\") pod \"nova-scheduler-0\" (UID: \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\") " pod="openstack/nova-scheduler-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.178113 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\") " pod="openstack/nova-scheduler-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.178151 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-config-data\") pod \"nova-scheduler-0\" (UID: \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\") " pod="openstack/nova-scheduler-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.182236 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-config-data\") pod \"nova-scheduler-0\" (UID: \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\") " pod="openstack/nova-scheduler-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.186320 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\") " pod="openstack/nova-scheduler-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.199140 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgvmd\" (UniqueName: \"kubernetes.io/projected/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-kube-api-access-wgvmd\") pod \"nova-scheduler-0\" (UID: \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\") " pod="openstack/nova-scheduler-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.279684 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdvfc\" (UniqueName: \"kubernetes.io/projected/0e0ed916-3672-43f4-8045-ceab250a8a6f-kube-api-access-zdvfc\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.279757 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-dns-swift-storage-0\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.279786 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-ovsdbserver-sb\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.279849 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-config\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.279869 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-dns-svc\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.279925 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-ovsdbserver-nb\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.380478 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.381371 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdvfc\" (UniqueName: \"kubernetes.io/projected/0e0ed916-3672-43f4-8045-ceab250a8a6f-kube-api-access-zdvfc\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.381805 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-dns-swift-storage-0\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.381856 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-ovsdbserver-sb\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.381889 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-config\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.382740 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-dns-svc\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.382935 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-ovsdbserver-nb\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.383036 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-config\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.383056 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-dns-swift-storage-0\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.383084 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-ovsdbserver-sb\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.383670 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-dns-svc\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.383717 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-ovsdbserver-nb\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.402190 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdvfc\" (UniqueName: \"kubernetes.io/projected/0e0ed916-3672-43f4-8045-ceab250a8a6f-kube-api-access-zdvfc\") pod \"dnsmasq-dns-7b495b9cc7-9rfgw\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.495739 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.527170 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.589764 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-52mqb"]
Mar 18 18:25:33 crc kubenswrapper[5008]: W0318 18:25:33.593494 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97b986d5_bdff_4a92_bec9_27511e91dd2b.slice/crio-a5d2053d11af759e0bfbab81aeafd31a959fedd459b3d365b0520e8012cca02c WatchSource:0}: Error finding container a5d2053d11af759e0bfbab81aeafd31a959fedd459b3d365b0520e8012cca02c: Status 404 returned error can't find the container with id a5d2053d11af759e0bfbab81aeafd31a959fedd459b3d365b0520e8012cca02c
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.654420 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84","Type":"ContainerStarted","Data":"a775bd2328e78c3e94321a78a960def3522b02e55084a83cd87ded4a5eb20603"}
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.655442 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.676709 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-52mqb" event={"ID":"97b986d5-bdff-4a92-bec9-27511e91dd2b","Type":"ContainerStarted","Data":"a5d2053d11af759e0bfbab81aeafd31a959fedd459b3d365b0520e8012cca02c"}
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.683498 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.127005155 podStartE2EDuration="2.683480536s" podCreationTimestamp="2026-03-18 18:25:31 +0000 UTC" firstStartedPulling="2026-03-18 18:25:32.508221607 +0000 UTC m=+1389.027694686" lastFinishedPulling="2026-03-18 18:25:33.064696988 +0000 UTC m=+1389.584170067" observedRunningTime="2026-03-18 18:25:33.674979072 +0000 UTC m=+1390.194452141" watchObservedRunningTime="2026-03-18 18:25:33.683480536 +0000 UTC m=+1390.202953615"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.700846 5008 generic.go:334] "Generic (PLEG): container finished" podID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerID="824ddbd5e53959ad5a073db431e345df2f3e0e1e485d22b5c38e48498c093a6c" exitCode=0
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.700884 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f913711-2d07-4cd4-84f0-a28baffb6b79","Type":"ContainerDied","Data":"824ddbd5e53959ad5a073db431e345df2f3e0e1e485d22b5c38e48498c093a6c"}
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.719758 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 18 18:25:33 crc kubenswrapper[5008]: W0318 18:25:33.732212 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28217cfa_14bc_4fef_bea3_f1ea3a446ee5.slice/crio-33ac2328f32e1cb4d3136f65f38fda1919f7cef8a7bfec58d1b71f6a7a086a2f WatchSource:0}: Error finding container 33ac2328f32e1cb4d3136f65f38fda1919f7cef8a7bfec58d1b71f6a7a086a2f: Status 404 returned error can't find the container with id 33ac2328f32e1cb4d3136f65f38fda1919f7cef8a7bfec58d1b71f6a7a086a2f
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.806478 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.889895 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8lvzk"]
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.891691 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8lvzk"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.899098 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.899533 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.906025 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8lvzk"]
Mar 18 18:25:33 crc kubenswrapper[5008]: W0318 18:25:33.966235 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04252a4e_5613_4a4d_b105_148f1db99d7e.slice/crio-cd66d3af9889caf5e1dfe56f7c4cb7ae1b66df4762c24b0dd3d898e2c33d8ab5 WatchSource:0}: Error finding container cd66d3af9889caf5e1dfe56f7c4cb7ae1b66df4762c24b0dd3d898e2c33d8ab5: Status 404 returned error can't find the container with id cd66d3af9889caf5e1dfe56f7c4cb7ae1b66df4762c24b0dd3d898e2c33d8ab5
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.971211 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.992571 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8lvzk\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " pod="openstack/nova-cell1-conductor-db-sync-8lvzk"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.992938 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-config-data\") pod \"nova-cell1-conductor-db-sync-8lvzk\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " pod="openstack/nova-cell1-conductor-db-sync-8lvzk"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.993201 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-scripts\") pod \"nova-cell1-conductor-db-sync-8lvzk\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " pod="openstack/nova-cell1-conductor-db-sync-8lvzk"
Mar 18 18:25:33 crc kubenswrapper[5008]: I0318 18:25:33.993352 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-254zp\" (UniqueName: \"kubernetes.io/projected/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-kube-api-access-254zp\") pod \"nova-cell1-conductor-db-sync-8lvzk\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " pod="openstack/nova-cell1-conductor-db-sync-8lvzk"
Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.081220 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.094861 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-scripts\") pod \"nova-cell1-conductor-db-sync-8lvzk\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " pod="openstack/nova-cell1-conductor-db-sync-8lvzk"
Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.095502 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-254zp\" (UniqueName: \"kubernetes.io/projected/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-kube-api-access-254zp\") pod \"nova-cell1-conductor-db-sync-8lvzk\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " pod="openstack/nova-cell1-conductor-db-sync-8lvzk"
Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.095668 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8lvzk\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " pod="openstack/nova-cell1-conductor-db-sync-8lvzk"
Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.095749 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-config-data\") pod \"nova-cell1-conductor-db-sync-8lvzk\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " pod="openstack/nova-cell1-conductor-db-sync-8lvzk"
Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.102259 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-config-data\") pod \"nova-cell1-conductor-db-sync-8lvzk\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " pod="openstack/nova-cell1-conductor-db-sync-8lvzk"
Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.107142 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-scripts\") pod \"nova-cell1-conductor-db-sync-8lvzk\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " pod="openstack/nova-cell1-conductor-db-sync-8lvzk"
Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.107431 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8lvzk\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " pod="openstack/nova-cell1-conductor-db-sync-8lvzk"
Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.120406 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-254zp\" (UniqueName: \"kubernetes.io/projected/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-kube-api-access-254zp\") pod \"nova-cell1-conductor-db-sync-8lvzk\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " pod="openstack/nova-cell1-conductor-db-sync-8lvzk"
Mar 18 18:25:34 crc kubenswrapper[5008]: W0318 18:25:34.187206 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e0ed916_3672_43f4_8045_ceab250a8a6f.slice/crio-a8f0353e965c25ecb1a59228a63208b1a904e8ef21b602771bce7b95c0d17b20 WatchSource:0}: Error finding container a8f0353e965c25ecb1a59228a63208b1a904e8ef21b602771bce7b95c0d17b20: Status 404 returned error can't find the container with id a8f0353e965c25ecb1a59228a63208b1a904e8ef21b602771bce7b95c0d17b20
Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.192240 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b495b9cc7-9rfgw"]
Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.307316 
5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8lvzk" Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.715769 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-52mqb" event={"ID":"97b986d5-bdff-4a92-bec9-27511e91dd2b","Type":"ContainerStarted","Data":"8aa1173c358dd95ef278aeaaebc5a33805fabac1f0468c56d4cf68beb6fd7318"} Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.722050 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04252a4e-5613-4a4d-b105-148f1db99d7e","Type":"ContainerStarted","Data":"cd66d3af9889caf5e1dfe56f7c4cb7ae1b66df4762c24b0dd3d898e2c33d8ab5"} Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.724614 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f9106d0-e19e-47c3-b7fb-8903eb6459ab","Type":"ContainerStarted","Data":"05990e61f325f91f0e40df210dfb5ab7ba5d44c72be720f8bda6026fc4174941"} Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.726325 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7714967-8c6f-4eac-8e87-6eb2c1cb754c","Type":"ContainerStarted","Data":"9d2fe1e22f9dd909f3a53dbcc7664e9d830766abef70e7355363974ad07e2672"} Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.727757 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"28217cfa-14bc-4fef-bea3-f1ea3a446ee5","Type":"ContainerStarted","Data":"33ac2328f32e1cb4d3136f65f38fda1919f7cef8a7bfec58d1b71f6a7a086a2f"} Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.731985 5008 generic.go:334] "Generic (PLEG): container finished" podID="0e0ed916-3672-43f4-8045-ceab250a8a6f" containerID="8bb1cebd17e7d2450f86b491860fab53d3e61037e5d2808b683bc3805fd6237d" exitCode=0 Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.732730 5008 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw" event={"ID":"0e0ed916-3672-43f4-8045-ceab250a8a6f","Type":"ContainerDied","Data":"8bb1cebd17e7d2450f86b491860fab53d3e61037e5d2808b683bc3805fd6237d"} Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.732778 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw" event={"ID":"0e0ed916-3672-43f4-8045-ceab250a8a6f","Type":"ContainerStarted","Data":"a8f0353e965c25ecb1a59228a63208b1a904e8ef21b602771bce7b95c0d17b20"} Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.744449 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-52mqb" podStartSLOduration=2.74443026 podStartE2EDuration="2.74443026s" podCreationTimestamp="2026-03-18 18:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:25:34.73984798 +0000 UTC m=+1391.259321069" watchObservedRunningTime="2026-03-18 18:25:34.74443026 +0000 UTC m=+1391.263903329" Mar 18 18:25:34 crc kubenswrapper[5008]: I0318 18:25:34.762748 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8lvzk"] Mar 18 18:25:34 crc kubenswrapper[5008]: W0318 18:25:34.770954 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17cfd4aa_d76e_4a7a_a4b1_c772f531ac03.slice/crio-86af3864c8ba6ecc48e9ff3377c0d7b4f35dd245095622a5df6ca3b225dac56a WatchSource:0}: Error finding container 86af3864c8ba6ecc48e9ff3377c0d7b4f35dd245095622a5df6ca3b225dac56a: Status 404 returned error can't find the container with id 86af3864c8ba6ecc48e9ff3377c0d7b4f35dd245095622a5df6ca3b225dac56a Mar 18 18:25:35 crc kubenswrapper[5008]: I0318 18:25:35.754077 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw" 
event={"ID":"0e0ed916-3672-43f4-8045-ceab250a8a6f","Type":"ContainerStarted","Data":"f25a87e2c6d87129a84288945fd7a42bfe68c586d6ccb3f6767f3d4e3c2246a1"} Mar 18 18:25:35 crc kubenswrapper[5008]: I0318 18:25:35.754540 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw" Mar 18 18:25:35 crc kubenswrapper[5008]: I0318 18:25:35.757706 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8lvzk" event={"ID":"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03","Type":"ContainerStarted","Data":"86af3864c8ba6ecc48e9ff3377c0d7b4f35dd245095622a5df6ca3b225dac56a"} Mar 18 18:25:35 crc kubenswrapper[5008]: I0318 18:25:35.777928 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw" podStartSLOduration=3.777910483 podStartE2EDuration="3.777910483s" podCreationTimestamp="2026-03-18 18:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:25:35.772688746 +0000 UTC m=+1392.292161825" watchObservedRunningTime="2026-03-18 18:25:35.777910483 +0000 UTC m=+1392.297383562" Mar 18 18:25:36 crc kubenswrapper[5008]: I0318 18:25:36.210070 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:36 crc kubenswrapper[5008]: I0318 18:25:36.221705 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:25:36 crc kubenswrapper[5008]: I0318 18:25:36.768029 5008 generic.go:334] "Generic (PLEG): container finished" podID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerID="7d9b1ff3a3c427f915bb19b684cfd611a22229c718fa1588cb7c471b62494d87" exitCode=0 Mar 18 18:25:36 crc kubenswrapper[5008]: I0318 18:25:36.768944 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3f913711-2d07-4cd4-84f0-a28baffb6b79","Type":"ContainerDied","Data":"7d9b1ff3a3c427f915bb19b684cfd611a22229c718fa1588cb7c471b62494d87"} Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.029732 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.108813 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f913711-2d07-4cd4-84f0-a28baffb6b79-run-httpd\") pod \"3f913711-2d07-4cd4-84f0-a28baffb6b79\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.109117 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-245ff\" (UniqueName: \"kubernetes.io/projected/3f913711-2d07-4cd4-84f0-a28baffb6b79-kube-api-access-245ff\") pod \"3f913711-2d07-4cd4-84f0-a28baffb6b79\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.109247 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-sg-core-conf-yaml\") pod \"3f913711-2d07-4cd4-84f0-a28baffb6b79\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.109278 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-scripts\") pod \"3f913711-2d07-4cd4-84f0-a28baffb6b79\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.109406 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-config-data\") pod 
\"3f913711-2d07-4cd4-84f0-a28baffb6b79\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.109459 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f913711-2d07-4cd4-84f0-a28baffb6b79-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3f913711-2d07-4cd4-84f0-a28baffb6b79" (UID: "3f913711-2d07-4cd4-84f0-a28baffb6b79"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.109516 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-combined-ca-bundle\") pod \"3f913711-2d07-4cd4-84f0-a28baffb6b79\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.109536 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f913711-2d07-4cd4-84f0-a28baffb6b79-log-httpd\") pod \"3f913711-2d07-4cd4-84f0-a28baffb6b79\" (UID: \"3f913711-2d07-4cd4-84f0-a28baffb6b79\") " Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.110129 5008 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f913711-2d07-4cd4-84f0-a28baffb6b79-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.110831 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f913711-2d07-4cd4-84f0-a28baffb6b79-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3f913711-2d07-4cd4-84f0-a28baffb6b79" (UID: "3f913711-2d07-4cd4-84f0-a28baffb6b79"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.131100 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f913711-2d07-4cd4-84f0-a28baffb6b79-kube-api-access-245ff" (OuterVolumeSpecName: "kube-api-access-245ff") pod "3f913711-2d07-4cd4-84f0-a28baffb6b79" (UID: "3f913711-2d07-4cd4-84f0-a28baffb6b79"). InnerVolumeSpecName "kube-api-access-245ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.131262 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-scripts" (OuterVolumeSpecName: "scripts") pod "3f913711-2d07-4cd4-84f0-a28baffb6b79" (UID: "3f913711-2d07-4cd4-84f0-a28baffb6b79"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.153870 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3f913711-2d07-4cd4-84f0-a28baffb6b79" (UID: "3f913711-2d07-4cd4-84f0-a28baffb6b79"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.212087 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-245ff\" (UniqueName: \"kubernetes.io/projected/3f913711-2d07-4cd4-84f0-a28baffb6b79-kube-api-access-245ff\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.212119 5008 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.212128 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.212138 5008 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f913711-2d07-4cd4-84f0-a28baffb6b79-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.226224 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f913711-2d07-4cd4-84f0-a28baffb6b79" (UID: "3f913711-2d07-4cd4-84f0-a28baffb6b79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.286610 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-config-data" (OuterVolumeSpecName: "config-data") pod "3f913711-2d07-4cd4-84f0-a28baffb6b79" (UID: "3f913711-2d07-4cd4-84f0-a28baffb6b79"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.314288 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.314618 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f913711-2d07-4cd4-84f0-a28baffb6b79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.830926 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04252a4e-5613-4a4d-b105-148f1db99d7e","Type":"ContainerStarted","Data":"1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06"} Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.830971 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04252a4e-5613-4a4d-b105-148f1db99d7e","Type":"ContainerStarted","Data":"47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7"} Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.830993 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="04252a4e-5613-4a4d-b105-148f1db99d7e" containerName="nova-metadata-log" containerID="cri-o://47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7" gracePeriod=30 Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.831087 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="04252a4e-5613-4a4d-b105-148f1db99d7e" containerName="nova-metadata-metadata" containerID="cri-o://1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06" gracePeriod=30 Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.845622 5008 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8lvzk" event={"ID":"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03","Type":"ContainerStarted","Data":"8b7b60b95877bad7e80482410e97cfbd1d3a6413902243c890181903304f480d"} Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.857096 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7714967-8c6f-4eac-8e87-6eb2c1cb754c","Type":"ContainerStarted","Data":"b732b6502fad5d9dd985aba72f2285f3e19b23d0713bcb80c127c8a5a313a2d8"} Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.857144 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7714967-8c6f-4eac-8e87-6eb2c1cb754c","Type":"ContainerStarted","Data":"f557c6a8ce1ba2781fcef8df9f5cf23382ccb3e15dc9bc4306d368be51d91304"} Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.862093 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.043519234 podStartE2EDuration="5.862073899s" podCreationTimestamp="2026-03-18 18:25:32 +0000 UTC" firstStartedPulling="2026-03-18 18:25:33.969598658 +0000 UTC m=+1390.489071737" lastFinishedPulling="2026-03-18 18:25:36.788153323 +0000 UTC m=+1393.307626402" observedRunningTime="2026-03-18 18:25:37.855068035 +0000 UTC m=+1394.374541124" watchObservedRunningTime="2026-03-18 18:25:37.862073899 +0000 UTC m=+1394.381546978" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.864543 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f913711-2d07-4cd4-84f0-a28baffb6b79","Type":"ContainerDied","Data":"07ebb703e28bf764f63d40b4133e5c9080bbf1dc7a099b7659e15a7901f546d8"} Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.864603 5008 scope.go:117] "RemoveContainer" containerID="2b0fc7f85c9c320c40ca861c593a3620b70d44e48bb0ed934a67d01439544ee5" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.864726 5008 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.870608 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f9106d0-e19e-47c3-b7fb-8903eb6459ab","Type":"ContainerStarted","Data":"3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a"} Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.877806 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-8lvzk" podStartSLOduration=4.877788882 podStartE2EDuration="4.877788882s" podCreationTimestamp="2026-03-18 18:25:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:25:37.872903004 +0000 UTC m=+1394.392376083" watchObservedRunningTime="2026-03-18 18:25:37.877788882 +0000 UTC m=+1394.397261961" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.878336 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"28217cfa-14bc-4fef-bea3-f1ea3a446ee5","Type":"ContainerStarted","Data":"59627fdb11217c45e0ed161286904dc1f4203c7e3e74b368f09133d93ec26680"} Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.878471 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="28217cfa-14bc-4fef-bea3-f1ea3a446ee5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://59627fdb11217c45e0ed161286904dc1f4203c7e3e74b368f09133d93ec26680" gracePeriod=30 Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.892144 5008 scope.go:117] "RemoveContainer" containerID="d47b1282daf1f99beac7e27d1625c58e8f607ddda65adc7bfd8f293f808b44c2" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.910717 5008 scope.go:117] "RemoveContainer" 
containerID="7d9b1ff3a3c427f915bb19b684cfd611a22229c718fa1588cb7c471b62494d87" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.912143 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.20616208 podStartE2EDuration="5.912111495s" podCreationTimestamp="2026-03-18 18:25:32 +0000 UTC" firstStartedPulling="2026-03-18 18:25:34.08149888 +0000 UTC m=+1390.600971949" lastFinishedPulling="2026-03-18 18:25:36.787448275 +0000 UTC m=+1393.306921364" observedRunningTime="2026-03-18 18:25:37.902783749 +0000 UTC m=+1394.422256838" watchObservedRunningTime="2026-03-18 18:25:37.912111495 +0000 UTC m=+1394.431584584" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.927846 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.966062858 podStartE2EDuration="5.927827278s" podCreationTimestamp="2026-03-18 18:25:32 +0000 UTC" firstStartedPulling="2026-03-18 18:25:33.830959033 +0000 UTC m=+1390.350432112" lastFinishedPulling="2026-03-18 18:25:36.792723453 +0000 UTC m=+1393.312196532" observedRunningTime="2026-03-18 18:25:37.917327612 +0000 UTC m=+1394.436800711" watchObservedRunningTime="2026-03-18 18:25:37.927827278 +0000 UTC m=+1394.447300357" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.933112 5008 scope.go:117] "RemoveContainer" containerID="824ddbd5e53959ad5a073db431e345df2f3e0e1e485d22b5c38e48498c093a6c" Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.978020 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:25:37 crc kubenswrapper[5008]: I0318 18:25:37.996590 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.003322 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:25:38 crc kubenswrapper[5008]: E0318 18:25:38.003765 5008 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="sg-core" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.003785 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="sg-core" Mar 18 18:25:38 crc kubenswrapper[5008]: E0318 18:25:38.003798 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="ceilometer-notification-agent" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.003805 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="ceilometer-notification-agent" Mar 18 18:25:38 crc kubenswrapper[5008]: E0318 18:25:38.003811 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="proxy-httpd" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.003818 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="proxy-httpd" Mar 18 18:25:38 crc kubenswrapper[5008]: E0318 18:25:38.003843 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="ceilometer-central-agent" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.003849 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="ceilometer-central-agent" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.004016 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="proxy-httpd" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.004033 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="ceilometer-central-agent" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 
18:25:38.004042 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="sg-core" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.004054 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" containerName="ceilometer-notification-agent" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.005690 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.006430 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.9844360500000002 podStartE2EDuration="6.006358882s" podCreationTimestamp="2026-03-18 18:25:32 +0000 UTC" firstStartedPulling="2026-03-18 18:25:33.742405145 +0000 UTC m=+1390.261878224" lastFinishedPulling="2026-03-18 18:25:36.764327977 +0000 UTC m=+1393.283801056" observedRunningTime="2026-03-18 18:25:37.961965595 +0000 UTC m=+1394.481438684" watchObservedRunningTime="2026-03-18 18:25:38.006358882 +0000 UTC m=+1394.525831961" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.008024 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.008226 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.010071 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.019022 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.038373 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/719c54e1-6e5f-4769-94b0-c3f329fcf966-log-httpd\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.038442 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqmph\" (UniqueName: \"kubernetes.io/projected/719c54e1-6e5f-4769-94b0-c3f329fcf966-kube-api-access-tqmph\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.038500 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-config-data\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.038526 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/719c54e1-6e5f-4769-94b0-c3f329fcf966-run-httpd\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.038543 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.038611 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-scripts\") pod \"ceilometer-0\" (UID: 
\"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.038652 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.038689 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.103369 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.140166 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-scripts\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.140236 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.140282 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.140348 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/719c54e1-6e5f-4769-94b0-c3f329fcf966-log-httpd\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.140399 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqmph\" (UniqueName: \"kubernetes.io/projected/719c54e1-6e5f-4769-94b0-c3f329fcf966-kube-api-access-tqmph\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.140432 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-config-data\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.140463 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/719c54e1-6e5f-4769-94b0-c3f329fcf966-run-httpd\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.140504 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.145007 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/719c54e1-6e5f-4769-94b0-c3f329fcf966-log-httpd\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.147991 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/719c54e1-6e5f-4769-94b0-c3f329fcf966-run-httpd\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.151942 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.152507 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-scripts\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.166584 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqmph\" (UniqueName: \"kubernetes.io/projected/719c54e1-6e5f-4769-94b0-c3f329fcf966-kube-api-access-tqmph\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.178239 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 
18:25:38.179098 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-config-data\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.191429 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.234987 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f913711-2d07-4cd4-84f0-a28baffb6b79" path="/var/lib/kubelet/pods/3f913711-2d07-4cd4-84f0-a28baffb6b79/volumes" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.361253 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.425198 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.496684 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.547024 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04252a4e-5613-4a4d-b105-148f1db99d7e-config-data\") pod \"04252a4e-5613-4a4d-b105-148f1db99d7e\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.547067 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfsdh\" (UniqueName: \"kubernetes.io/projected/04252a4e-5613-4a4d-b105-148f1db99d7e-kube-api-access-pfsdh\") pod \"04252a4e-5613-4a4d-b105-148f1db99d7e\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.547134 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04252a4e-5613-4a4d-b105-148f1db99d7e-logs\") pod \"04252a4e-5613-4a4d-b105-148f1db99d7e\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.547192 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04252a4e-5613-4a4d-b105-148f1db99d7e-combined-ca-bundle\") pod \"04252a4e-5613-4a4d-b105-148f1db99d7e\" (UID: \"04252a4e-5613-4a4d-b105-148f1db99d7e\") " Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.547745 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04252a4e-5613-4a4d-b105-148f1db99d7e-logs" (OuterVolumeSpecName: "logs") pod "04252a4e-5613-4a4d-b105-148f1db99d7e" (UID: "04252a4e-5613-4a4d-b105-148f1db99d7e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.548072 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04252a4e-5613-4a4d-b105-148f1db99d7e-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.551691 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04252a4e-5613-4a4d-b105-148f1db99d7e-kube-api-access-pfsdh" (OuterVolumeSpecName: "kube-api-access-pfsdh") pod "04252a4e-5613-4a4d-b105-148f1db99d7e" (UID: "04252a4e-5613-4a4d-b105-148f1db99d7e"). InnerVolumeSpecName "kube-api-access-pfsdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.578830 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04252a4e-5613-4a4d-b105-148f1db99d7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04252a4e-5613-4a4d-b105-148f1db99d7e" (UID: "04252a4e-5613-4a4d-b105-148f1db99d7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.591649 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04252a4e-5613-4a4d-b105-148f1db99d7e-config-data" (OuterVolumeSpecName: "config-data") pod "04252a4e-5613-4a4d-b105-148f1db99d7e" (UID: "04252a4e-5613-4a4d-b105-148f1db99d7e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.649464 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04252a4e-5613-4a4d-b105-148f1db99d7e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.649489 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfsdh\" (UniqueName: \"kubernetes.io/projected/04252a4e-5613-4a4d-b105-148f1db99d7e-kube-api-access-pfsdh\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.649499 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04252a4e-5613-4a4d-b105-148f1db99d7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.906947 5008 generic.go:334] "Generic (PLEG): container finished" podID="04252a4e-5613-4a4d-b105-148f1db99d7e" containerID="1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06" exitCode=0 Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.907298 5008 generic.go:334] "Generic (PLEG): container finished" podID="04252a4e-5613-4a4d-b105-148f1db99d7e" containerID="47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7" exitCode=143 Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.907004 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04252a4e-5613-4a4d-b105-148f1db99d7e","Type":"ContainerDied","Data":"1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06"} Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.907417 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04252a4e-5613-4a4d-b105-148f1db99d7e","Type":"ContainerDied","Data":"47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7"} Mar 18 18:25:38 crc 
kubenswrapper[5008]: I0318 18:25:38.907443 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04252a4e-5613-4a4d-b105-148f1db99d7e","Type":"ContainerDied","Data":"cd66d3af9889caf5e1dfe56f7c4cb7ae1b66df4762c24b0dd3d898e2c33d8ab5"} Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.907465 5008 scope.go:117] "RemoveContainer" containerID="1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.907091 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.931058 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.946926 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.951723 5008 scope.go:117] "RemoveContainer" containerID="47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7" Mar 18 18:25:38 crc kubenswrapper[5008]: I0318 18:25:38.981769 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.006178 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:39 crc kubenswrapper[5008]: E0318 18:25:39.007101 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04252a4e-5613-4a4d-b105-148f1db99d7e" containerName="nova-metadata-metadata" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.007125 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="04252a4e-5613-4a4d-b105-148f1db99d7e" containerName="nova-metadata-metadata" Mar 18 18:25:39 crc kubenswrapper[5008]: E0318 18:25:39.007145 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04252a4e-5613-4a4d-b105-148f1db99d7e" 
containerName="nova-metadata-log" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.007153 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="04252a4e-5613-4a4d-b105-148f1db99d7e" containerName="nova-metadata-log" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.007446 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="04252a4e-5613-4a4d-b105-148f1db99d7e" containerName="nova-metadata-metadata" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.007488 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="04252a4e-5613-4a4d-b105-148f1db99d7e" containerName="nova-metadata-log" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.008995 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.010906 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.011136 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.016634 5008 scope.go:117] "RemoveContainer" containerID="1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06" Mar 18 18:25:39 crc kubenswrapper[5008]: E0318 18:25:39.017039 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06\": container with ID starting with 1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06 not found: ID does not exist" containerID="1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.017065 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06"} err="failed to get container status \"1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06\": rpc error: code = NotFound desc = could not find container \"1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06\": container with ID starting with 1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06 not found: ID does not exist" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.017083 5008 scope.go:117] "RemoveContainer" containerID="47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7" Mar 18 18:25:39 crc kubenswrapper[5008]: E0318 18:25:39.017526 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7\": container with ID starting with 47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7 not found: ID does not exist" containerID="47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.017546 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7"} err="failed to get container status \"47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7\": rpc error: code = NotFound desc = could not find container \"47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7\": container with ID starting with 47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7 not found: ID does not exist" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.017579 5008 scope.go:117] "RemoveContainer" containerID="1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.017961 5008 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06"} err="failed to get container status \"1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06\": rpc error: code = NotFound desc = could not find container \"1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06\": container with ID starting with 1b4fd09dcde693c9050b1d152e576b5b78c29d5f1c91435fa2cc21365c850f06 not found: ID does not exist" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.018002 5008 scope.go:117] "RemoveContainer" containerID="47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.018513 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7"} err="failed to get container status \"47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7\": rpc error: code = NotFound desc = could not find container \"47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7\": container with ID starting with 47a527e0199050b149ea25adad896beeb8fba5b4010e93872ca5d7f7641c44a7 not found: ID does not exist" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.020747 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.160294 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9hft\" (UniqueName: \"kubernetes.io/projected/361bf9a3-1d04-4606-b978-9cc654172bab-kube-api-access-f9hft\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.160525 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.160566 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/361bf9a3-1d04-4606-b978-9cc654172bab-logs\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.160648 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-config-data\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.160690 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.262579 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-config-data\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.262673 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.262730 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hft\" (UniqueName: \"kubernetes.io/projected/361bf9a3-1d04-4606-b978-9cc654172bab-kube-api-access-f9hft\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.262780 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.262820 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/361bf9a3-1d04-4606-b978-9cc654172bab-logs\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.263418 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/361bf9a3-1d04-4606-b978-9cc654172bab-logs\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.271276 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.272792 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-config-data\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.275426 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.292034 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9hft\" (UniqueName: \"kubernetes.io/projected/361bf9a3-1d04-4606-b978-9cc654172bab-kube-api-access-f9hft\") pod \"nova-metadata-0\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.336170 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:25:39 crc kubenswrapper[5008]: W0318 18:25:39.819156 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod361bf9a3_1d04_4606_b978_9cc654172bab.slice/crio-57ba454f625190db79ef81266b3022de3c8660596f778605ff5905ea15404299 WatchSource:0}: Error finding container 57ba454f625190db79ef81266b3022de3c8660596f778605ff5905ea15404299: Status 404 returned error can't find the container with id 57ba454f625190db79ef81266b3022de3c8660596f778605ff5905ea15404299 Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.831227 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.921127 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"361bf9a3-1d04-4606-b978-9cc654172bab","Type":"ContainerStarted","Data":"57ba454f625190db79ef81266b3022de3c8660596f778605ff5905ea15404299"} Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.925847 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"719c54e1-6e5f-4769-94b0-c3f329fcf966","Type":"ContainerStarted","Data":"05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7"} Mar 18 18:25:39 crc kubenswrapper[5008]: I0318 18:25:39.925889 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"719c54e1-6e5f-4769-94b0-c3f329fcf966","Type":"ContainerStarted","Data":"7ea9ee2a8ab3901a9fb2bfa9c46c452c9801b18a71c52e8756907367866229d7"} Mar 18 18:25:40 crc kubenswrapper[5008]: I0318 18:25:40.209468 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04252a4e-5613-4a4d-b105-148f1db99d7e" path="/var/lib/kubelet/pods/04252a4e-5613-4a4d-b105-148f1db99d7e/volumes" Mar 18 18:25:40 crc kubenswrapper[5008]: I0318 18:25:40.933384 5008 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"719c54e1-6e5f-4769-94b0-c3f329fcf966","Type":"ContainerStarted","Data":"c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676"} Mar 18 18:25:40 crc kubenswrapper[5008]: I0318 18:25:40.934988 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"361bf9a3-1d04-4606-b978-9cc654172bab","Type":"ContainerStarted","Data":"3f9c8b36e3bfc7982a3be42688a05db61a00f36de8b934ad514229c5f5188611"} Mar 18 18:25:40 crc kubenswrapper[5008]: I0318 18:25:40.935009 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"361bf9a3-1d04-4606-b978-9cc654172bab","Type":"ContainerStarted","Data":"41f033a6fee7af6f608a13cb7eea2a28d665070f761e75f9ebb38d2de64f943d"} Mar 18 18:25:40 crc kubenswrapper[5008]: I0318 18:25:40.954295 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.954278158 podStartE2EDuration="2.954278158s" podCreationTimestamp="2026-03-18 18:25:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:25:40.953472577 +0000 UTC m=+1397.472945696" watchObservedRunningTime="2026-03-18 18:25:40.954278158 +0000 UTC m=+1397.473751237" Mar 18 18:25:41 crc kubenswrapper[5008]: I0318 18:25:41.945966 5008 generic.go:334] "Generic (PLEG): container finished" podID="97b986d5-bdff-4a92-bec9-27511e91dd2b" containerID="8aa1173c358dd95ef278aeaaebc5a33805fabac1f0468c56d4cf68beb6fd7318" exitCode=0 Mar 18 18:25:41 crc kubenswrapper[5008]: I0318 18:25:41.946024 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-52mqb" event={"ID":"97b986d5-bdff-4a92-bec9-27511e91dd2b","Type":"ContainerDied","Data":"8aa1173c358dd95ef278aeaaebc5a33805fabac1f0468c56d4cf68beb6fd7318"} Mar 18 18:25:41 crc kubenswrapper[5008]: I0318 18:25:41.948484 
5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"719c54e1-6e5f-4769-94b0-c3f329fcf966","Type":"ContainerStarted","Data":"bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583"} Mar 18 18:25:42 crc kubenswrapper[5008]: I0318 18:25:42.009324 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.064994 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.066790 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.397149 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-52mqb" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.496015 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.524248 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.528668 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.549608 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-combined-ca-bundle\") pod \"97b986d5-bdff-4a92-bec9-27511e91dd2b\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.549741 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-config-data\") pod \"97b986d5-bdff-4a92-bec9-27511e91dd2b\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.550621 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll77p\" (UniqueName: \"kubernetes.io/projected/97b986d5-bdff-4a92-bec9-27511e91dd2b-kube-api-access-ll77p\") pod \"97b986d5-bdff-4a92-bec9-27511e91dd2b\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.550652 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-scripts\") pod \"97b986d5-bdff-4a92-bec9-27511e91dd2b\" (UID: \"97b986d5-bdff-4a92-bec9-27511e91dd2b\") " Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.568217 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b986d5-bdff-4a92-bec9-27511e91dd2b-kube-api-access-ll77p" (OuterVolumeSpecName: "kube-api-access-ll77p") pod "97b986d5-bdff-4a92-bec9-27511e91dd2b" (UID: "97b986d5-bdff-4a92-bec9-27511e91dd2b"). InnerVolumeSpecName "kube-api-access-ll77p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.580658 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-scripts" (OuterVolumeSpecName: "scripts") pod "97b986d5-bdff-4a92-bec9-27511e91dd2b" (UID: "97b986d5-bdff-4a92-bec9-27511e91dd2b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.611304 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97b986d5-bdff-4a92-bec9-27511e91dd2b" (UID: "97b986d5-bdff-4a92-bec9-27511e91dd2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.631217 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-config-data" (OuterVolumeSpecName: "config-data") pod "97b986d5-bdff-4a92-bec9-27511e91dd2b" (UID: "97b986d5-bdff-4a92-bec9-27511e91dd2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.643916 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8995fbb57-snzkk"] Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.644222 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8995fbb57-snzkk" podUID="fe68e762-c3c6-46d4-a897-c254828dd808" containerName="dnsmasq-dns" containerID="cri-o://a8c0850c31f1fbc0a4c6c047087e854ab6daf0542dc068d89a43baf1fc032648" gracePeriod=10 Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.656331 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll77p\" (UniqueName: \"kubernetes.io/projected/97b986d5-bdff-4a92-bec9-27511e91dd2b-kube-api-access-ll77p\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.656696 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:43 
crc kubenswrapper[5008]: I0318 18:25:43.656850 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.656996 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97b986d5-bdff-4a92-bec9-27511e91dd2b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.974500 5008 generic.go:334] "Generic (PLEG): container finished" podID="fe68e762-c3c6-46d4-a897-c254828dd808" containerID="a8c0850c31f1fbc0a4c6c047087e854ab6daf0542dc068d89a43baf1fc032648" exitCode=0 Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.974599 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8995fbb57-snzkk" event={"ID":"fe68e762-c3c6-46d4-a897-c254828dd808","Type":"ContainerDied","Data":"a8c0850c31f1fbc0a4c6c047087e854ab6daf0542dc068d89a43baf1fc032648"} Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.984845 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-52mqb" event={"ID":"97b986d5-bdff-4a92-bec9-27511e91dd2b","Type":"ContainerDied","Data":"a5d2053d11af759e0bfbab81aeafd31a959fedd459b3d365b0520e8012cca02c"} Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.984912 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5d2053d11af759e0bfbab81aeafd31a959fedd459b3d365b0520e8012cca02c" Mar 18 18:25:43 crc kubenswrapper[5008]: I0318 18:25:43.984997 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-52mqb" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.039038 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.068665 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.153011 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b7714967-8c6f-4eac-8e87-6eb2c1cb754c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.153748 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b7714967-8c6f-4eac-8e87-6eb2c1cb754c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.167108 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-config\") pod \"fe68e762-c3c6-46d4-a897-c254828dd808\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.167144 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-dns-swift-storage-0\") pod \"fe68e762-c3c6-46d4-a897-c254828dd808\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.167245 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-dns-svc\") pod \"fe68e762-c3c6-46d4-a897-c254828dd808\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.167296 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-ovsdbserver-nb\") pod \"fe68e762-c3c6-46d4-a897-c254828dd808\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.167335 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8cx6\" (UniqueName: \"kubernetes.io/projected/fe68e762-c3c6-46d4-a897-c254828dd808-kube-api-access-m8cx6\") pod \"fe68e762-c3c6-46d4-a897-c254828dd808\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.167388 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-ovsdbserver-sb\") pod \"fe68e762-c3c6-46d4-a897-c254828dd808\" (UID: \"fe68e762-c3c6-46d4-a897-c254828dd808\") " Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.173843 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe68e762-c3c6-46d4-a897-c254828dd808-kube-api-access-m8cx6" (OuterVolumeSpecName: "kube-api-access-m8cx6") pod "fe68e762-c3c6-46d4-a897-c254828dd808" (UID: "fe68e762-c3c6-46d4-a897-c254828dd808"). InnerVolumeSpecName "kube-api-access-m8cx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.222645 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fe68e762-c3c6-46d4-a897-c254828dd808" (UID: "fe68e762-c3c6-46d4-a897-c254828dd808"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.222655 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe68e762-c3c6-46d4-a897-c254828dd808" (UID: "fe68e762-c3c6-46d4-a897-c254828dd808"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.223288 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe68e762-c3c6-46d4-a897-c254828dd808" (UID: "fe68e762-c3c6-46d4-a897-c254828dd808"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.241414 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-config" (OuterVolumeSpecName: "config") pod "fe68e762-c3c6-46d4-a897-c254828dd808" (UID: "fe68e762-c3c6-46d4-a897-c254828dd808"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.256803 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.274391 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.274427 5008 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.274448 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.274459 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.274467 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8cx6\" (UniqueName: \"kubernetes.io/projected/fe68e762-c3c6-46d4-a897-c254828dd808-kube-api-access-m8cx6\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.292952 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.293276 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="361bf9a3-1d04-4606-b978-9cc654172bab" containerName="nova-metadata-log" 
containerID="cri-o://41f033a6fee7af6f608a13cb7eea2a28d665070f761e75f9ebb38d2de64f943d" gracePeriod=30 Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.293486 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="361bf9a3-1d04-4606-b978-9cc654172bab" containerName="nova-metadata-metadata" containerID="cri-o://3f9c8b36e3bfc7982a3be42688a05db61a00f36de8b934ad514229c5f5188611" gracePeriod=30 Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.294036 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe68e762-c3c6-46d4-a897-c254828dd808" (UID: "fe68e762-c3c6-46d4-a897-c254828dd808"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.376525 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe68e762-c3c6-46d4-a897-c254828dd808-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:44 crc kubenswrapper[5008]: I0318 18:25:44.568064 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.003367 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8995fbb57-snzkk" event={"ID":"fe68e762-c3c6-46d4-a897-c254828dd808","Type":"ContainerDied","Data":"f31a77ff8c99f25b89a5ee6bcdc603c0567df4c8db1528ba553951ba6fd64a9b"} Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.003834 5008 scope.go:117] "RemoveContainer" containerID="a8c0850c31f1fbc0a4c6c047087e854ab6daf0542dc068d89a43baf1fc032648" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.003399 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8995fbb57-snzkk" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.005401 5008 generic.go:334] "Generic (PLEG): container finished" podID="17cfd4aa-d76e-4a7a-a4b1-c772f531ac03" containerID="8b7b60b95877bad7e80482410e97cfbd1d3a6413902243c890181903304f480d" exitCode=0 Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.005462 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8lvzk" event={"ID":"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03","Type":"ContainerDied","Data":"8b7b60b95877bad7e80482410e97cfbd1d3a6413902243c890181903304f480d"} Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.017220 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"719c54e1-6e5f-4769-94b0-c3f329fcf966","Type":"ContainerStarted","Data":"c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37"} Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.017934 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.020587 5008 generic.go:334] "Generic (PLEG): container finished" podID="361bf9a3-1d04-4606-b978-9cc654172bab" containerID="3f9c8b36e3bfc7982a3be42688a05db61a00f36de8b934ad514229c5f5188611" exitCode=0 Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.020611 5008 generic.go:334] "Generic (PLEG): container finished" podID="361bf9a3-1d04-4606-b978-9cc654172bab" containerID="41f033a6fee7af6f608a13cb7eea2a28d665070f761e75f9ebb38d2de64f943d" exitCode=143 Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.020727 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"361bf9a3-1d04-4606-b978-9cc654172bab","Type":"ContainerDied","Data":"3f9c8b36e3bfc7982a3be42688a05db61a00f36de8b934ad514229c5f5188611"} Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.020778 5008 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b7714967-8c6f-4eac-8e87-6eb2c1cb754c" containerName="nova-api-log" containerID="cri-o://f557c6a8ce1ba2781fcef8df9f5cf23382ccb3e15dc9bc4306d368be51d91304" gracePeriod=30 Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.020790 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"361bf9a3-1d04-4606-b978-9cc654172bab","Type":"ContainerDied","Data":"41f033a6fee7af6f608a13cb7eea2a28d665070f761e75f9ebb38d2de64f943d"} Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.021001 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b7714967-8c6f-4eac-8e87-6eb2c1cb754c" containerName="nova-api-api" containerID="cri-o://b732b6502fad5d9dd985aba72f2285f3e19b23d0713bcb80c127c8a5a313a2d8" gracePeriod=30 Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.051094 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.059663 5008 scope.go:117] "RemoveContainer" containerID="e42d030eac9288a107b6bca15738bed5ae0d85520b780e690d3fa617e74d46cc" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.074359 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.219302005 podStartE2EDuration="8.074339922s" podCreationTimestamp="2026-03-18 18:25:37 +0000 UTC" firstStartedPulling="2026-03-18 18:25:38.967915824 +0000 UTC m=+1395.487388903" lastFinishedPulling="2026-03-18 18:25:43.822953751 +0000 UTC m=+1400.342426820" observedRunningTime="2026-03-18 18:25:45.067774729 +0000 UTC m=+1401.587247808" watchObservedRunningTime="2026-03-18 18:25:45.074339922 +0000 UTC m=+1401.593812991" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.092357 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9hft\" (UniqueName: \"kubernetes.io/projected/361bf9a3-1d04-4606-b978-9cc654172bab-kube-api-access-f9hft\") pod \"361bf9a3-1d04-4606-b978-9cc654172bab\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.092701 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-combined-ca-bundle\") pod \"361bf9a3-1d04-4606-b978-9cc654172bab\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.092886 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/361bf9a3-1d04-4606-b978-9cc654172bab-logs\") pod \"361bf9a3-1d04-4606-b978-9cc654172bab\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.093045 5008 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-nova-metadata-tls-certs\") pod \"361bf9a3-1d04-4606-b978-9cc654172bab\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.093174 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-config-data\") pod \"361bf9a3-1d04-4606-b978-9cc654172bab\" (UID: \"361bf9a3-1d04-4606-b978-9cc654172bab\") " Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.093211 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/361bf9a3-1d04-4606-b978-9cc654172bab-logs" (OuterVolumeSpecName: "logs") pod "361bf9a3-1d04-4606-b978-9cc654172bab" (UID: "361bf9a3-1d04-4606-b978-9cc654172bab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.098508 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/361bf9a3-1d04-4606-b978-9cc654172bab-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.104088 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361bf9a3-1d04-4606-b978-9cc654172bab-kube-api-access-f9hft" (OuterVolumeSpecName: "kube-api-access-f9hft") pod "361bf9a3-1d04-4606-b978-9cc654172bab" (UID: "361bf9a3-1d04-4606-b978-9cc654172bab"). InnerVolumeSpecName "kube-api-access-f9hft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.121600 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8995fbb57-snzkk"] Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.130491 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8995fbb57-snzkk"] Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.132888 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "361bf9a3-1d04-4606-b978-9cc654172bab" (UID: "361bf9a3-1d04-4606-b978-9cc654172bab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.140654 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-config-data" (OuterVolumeSpecName: "config-data") pod "361bf9a3-1d04-4606-b978-9cc654172bab" (UID: "361bf9a3-1d04-4606-b978-9cc654172bab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.159704 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "361bf9a3-1d04-4606-b978-9cc654172bab" (UID: "361bf9a3-1d04-4606-b978-9cc654172bab"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.199625 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9hft\" (UniqueName: \"kubernetes.io/projected/361bf9a3-1d04-4606-b978-9cc654172bab-kube-api-access-f9hft\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.199662 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.199675 5008 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:45 crc kubenswrapper[5008]: I0318 18:25:45.199686 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361bf9a3-1d04-4606-b978-9cc654172bab-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.035720 5008 generic.go:334] "Generic (PLEG): container finished" podID="b7714967-8c6f-4eac-8e87-6eb2c1cb754c" containerID="f557c6a8ce1ba2781fcef8df9f5cf23382ccb3e15dc9bc4306d368be51d91304" exitCode=143 Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.035821 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7714967-8c6f-4eac-8e87-6eb2c1cb754c","Type":"ContainerDied","Data":"f557c6a8ce1ba2781fcef8df9f5cf23382ccb3e15dc9bc4306d368be51d91304"} Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.038467 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"361bf9a3-1d04-4606-b978-9cc654172bab","Type":"ContainerDied","Data":"57ba454f625190db79ef81266b3022de3c8660596f778605ff5905ea15404299"} Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.038510 5008 scope.go:117] "RemoveContainer" containerID="3f9c8b36e3bfc7982a3be42688a05db61a00f36de8b934ad514229c5f5188611" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.038528 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.044461 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9f9106d0-e19e-47c3-b7fb-8903eb6459ab" containerName="nova-scheduler-scheduler" containerID="cri-o://3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a" gracePeriod=30 Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.080737 5008 scope.go:117] "RemoveContainer" containerID="41f033a6fee7af6f608a13cb7eea2a28d665070f761e75f9ebb38d2de64f943d" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.120433 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.135995 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.154999 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:46 crc kubenswrapper[5008]: E0318 18:25:46.155543 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe68e762-c3c6-46d4-a897-c254828dd808" containerName="init" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.155686 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe68e762-c3c6-46d4-a897-c254828dd808" containerName="init" Mar 18 18:25:46 crc kubenswrapper[5008]: E0318 18:25:46.155767 5008 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="361bf9a3-1d04-4606-b978-9cc654172bab" containerName="nova-metadata-metadata" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.155821 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="361bf9a3-1d04-4606-b978-9cc654172bab" containerName="nova-metadata-metadata" Mar 18 18:25:46 crc kubenswrapper[5008]: E0318 18:25:46.155888 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe68e762-c3c6-46d4-a897-c254828dd808" containerName="dnsmasq-dns" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.155946 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe68e762-c3c6-46d4-a897-c254828dd808" containerName="dnsmasq-dns" Mar 18 18:25:46 crc kubenswrapper[5008]: E0318 18:25:46.156016 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b986d5-bdff-4a92-bec9-27511e91dd2b" containerName="nova-manage" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.156067 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b986d5-bdff-4a92-bec9-27511e91dd2b" containerName="nova-manage" Mar 18 18:25:46 crc kubenswrapper[5008]: E0318 18:25:46.156128 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361bf9a3-1d04-4606-b978-9cc654172bab" containerName="nova-metadata-log" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.156177 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="361bf9a3-1d04-4606-b978-9cc654172bab" containerName="nova-metadata-log" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.156404 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="97b986d5-bdff-4a92-bec9-27511e91dd2b" containerName="nova-manage" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.156482 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe68e762-c3c6-46d4-a897-c254828dd808" containerName="dnsmasq-dns" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.156537 5008 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="361bf9a3-1d04-4606-b978-9cc654172bab" containerName="nova-metadata-metadata" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.156614 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="361bf9a3-1d04-4606-b978-9cc654172bab" containerName="nova-metadata-log" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.157579 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.161834 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.162321 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.168152 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.210233 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="361bf9a3-1d04-4606-b978-9cc654172bab" path="/var/lib/kubelet/pods/361bf9a3-1d04-4606-b978-9cc654172bab/volumes" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.211192 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe68e762-c3c6-46d4-a897-c254828dd808" path="/var/lib/kubelet/pods/fe68e762-c3c6-46d4-a897-c254828dd808/volumes" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.216532 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-logs\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.216611 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.216637 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.216694 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x4kq\" (UniqueName: \"kubernetes.io/projected/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-kube-api-access-4x4kq\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.216808 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-config-data\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.318524 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-logs\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.318900 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.318922 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.318955 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x4kq\" (UniqueName: \"kubernetes.io/projected/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-kube-api-access-4x4kq\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.319010 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-config-data\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.321736 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-logs\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.325064 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 
18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.336261 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-config-data\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.339256 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x4kq\" (UniqueName: \"kubernetes.io/projected/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-kube-api-access-4x4kq\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.343231 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.430138 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8lvzk" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.486509 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.521987 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-combined-ca-bundle\") pod \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.522127 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-254zp\" (UniqueName: \"kubernetes.io/projected/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-kube-api-access-254zp\") pod \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.522183 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-config-data\") pod \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.522225 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-scripts\") pod \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\" (UID: \"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03\") " Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.531720 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-scripts" (OuterVolumeSpecName: "scripts") pod "17cfd4aa-d76e-4a7a-a4b1-c772f531ac03" (UID: "17cfd4aa-d76e-4a7a-a4b1-c772f531ac03"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.531762 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-kube-api-access-254zp" (OuterVolumeSpecName: "kube-api-access-254zp") pod "17cfd4aa-d76e-4a7a-a4b1-c772f531ac03" (UID: "17cfd4aa-d76e-4a7a-a4b1-c772f531ac03"). InnerVolumeSpecName "kube-api-access-254zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.556729 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-config-data" (OuterVolumeSpecName: "config-data") pod "17cfd4aa-d76e-4a7a-a4b1-c772f531ac03" (UID: "17cfd4aa-d76e-4a7a-a4b1-c772f531ac03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.558820 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17cfd4aa-d76e-4a7a-a4b1-c772f531ac03" (UID: "17cfd4aa-d76e-4a7a-a4b1-c772f531ac03"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.623775 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.624198 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.624211 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.624223 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-254zp\" (UniqueName: \"kubernetes.io/projected/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03-kube-api-access-254zp\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:46 crc kubenswrapper[5008]: I0318 18:25:46.770236 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:25:46 crc kubenswrapper[5008]: W0318 18:25:46.786669 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff482a80_2d35_4ec7_a524_67c8bbf33b5f.slice/crio-4447c98e6b6909125a900de89d1e62293b6868faa745aabfa1b0eda5c388ab79 WatchSource:0}: Error finding container 4447c98e6b6909125a900de89d1e62293b6868faa745aabfa1b0eda5c388ab79: Status 404 returned error can't find the container with id 4447c98e6b6909125a900de89d1e62293b6868faa745aabfa1b0eda5c388ab79 Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.075855 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"ff482a80-2d35-4ec7-a524-67c8bbf33b5f","Type":"ContainerStarted","Data":"91d5e118aa7398732eea62cdb18d46fc81b2f3f3a83f501949addecd71445617"} Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.075905 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff482a80-2d35-4ec7-a524-67c8bbf33b5f","Type":"ContainerStarted","Data":"4447c98e6b6909125a900de89d1e62293b6868faa745aabfa1b0eda5c388ab79"} Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.081181 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8lvzk" event={"ID":"17cfd4aa-d76e-4a7a-a4b1-c772f531ac03","Type":"ContainerDied","Data":"86af3864c8ba6ecc48e9ff3377c0d7b4f35dd245095622a5df6ca3b225dac56a"} Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.081222 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86af3864c8ba6ecc48e9ff3377c0d7b4f35dd245095622a5df6ca3b225dac56a" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.081244 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8lvzk" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.132619 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:25:47 crc kubenswrapper[5008]: E0318 18:25:47.133534 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17cfd4aa-d76e-4a7a-a4b1-c772f531ac03" containerName="nova-cell1-conductor-db-sync" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.133649 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="17cfd4aa-d76e-4a7a-a4b1-c772f531ac03" containerName="nova-cell1-conductor-db-sync" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.133971 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="17cfd4aa-d76e-4a7a-a4b1-c772f531ac03" containerName="nova-cell1-conductor-db-sync" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.134832 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.139427 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.145357 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.336847 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd462bb4-44f5-4e0f-bc17-53d24604d474-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dd462bb4-44f5-4e0f-bc17-53d24604d474\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.336967 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dd462bb4-44f5-4e0f-bc17-53d24604d474-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dd462bb4-44f5-4e0f-bc17-53d24604d474\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.337089 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5pt7\" (UniqueName: \"kubernetes.io/projected/dd462bb4-44f5-4e0f-bc17-53d24604d474-kube-api-access-x5pt7\") pod \"nova-cell1-conductor-0\" (UID: \"dd462bb4-44f5-4e0f-bc17-53d24604d474\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.439095 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd462bb4-44f5-4e0f-bc17-53d24604d474-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dd462bb4-44f5-4e0f-bc17-53d24604d474\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.439251 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd462bb4-44f5-4e0f-bc17-53d24604d474-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dd462bb4-44f5-4e0f-bc17-53d24604d474\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.439338 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5pt7\" (UniqueName: \"kubernetes.io/projected/dd462bb4-44f5-4e0f-bc17-53d24604d474-kube-api-access-x5pt7\") pod \"nova-cell1-conductor-0\" (UID: \"dd462bb4-44f5-4e0f-bc17-53d24604d474\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.444955 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd462bb4-44f5-4e0f-bc17-53d24604d474-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"dd462bb4-44f5-4e0f-bc17-53d24604d474\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.446296 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd462bb4-44f5-4e0f-bc17-53d24604d474-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dd462bb4-44f5-4e0f-bc17-53d24604d474\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.459060 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5pt7\" (UniqueName: \"kubernetes.io/projected/dd462bb4-44f5-4e0f-bc17-53d24604d474-kube-api-access-x5pt7\") pod \"nova-cell1-conductor-0\" (UID: \"dd462bb4-44f5-4e0f-bc17-53d24604d474\") " pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:47 crc kubenswrapper[5008]: I0318 18:25:47.753694 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:48 crc kubenswrapper[5008]: I0318 18:25:48.091446 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff482a80-2d35-4ec7-a524-67c8bbf33b5f","Type":"ContainerStarted","Data":"bedae4070f7dbf066a96be1c8f653bd332dd24ec8bc83fc68ab873cfa86082d0"} Mar 18 18:25:48 crc kubenswrapper[5008]: I0318 18:25:48.113115 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.113095656 podStartE2EDuration="2.113095656s" podCreationTimestamp="2026-03-18 18:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:25:48.112898881 +0000 UTC m=+1404.632371990" watchObservedRunningTime="2026-03-18 18:25:48.113095656 +0000 UTC m=+1404.632568735" Mar 18 18:25:48 crc kubenswrapper[5008]: I0318 18:25:48.272401 5008 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:25:48 crc kubenswrapper[5008]: E0318 18:25:48.496932 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a is running failed: container process not found" containerID="3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:25:48 crc kubenswrapper[5008]: E0318 18:25:48.497519 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a is running failed: container process not found" containerID="3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:25:48 crc kubenswrapper[5008]: E0318 18:25:48.497742 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a is running failed: container process not found" containerID="3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:25:48 crc kubenswrapper[5008]: E0318 18:25:48.497769 5008 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9f9106d0-e19e-47c3-b7fb-8903eb6459ab" containerName="nova-scheduler-scheduler" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.028007 5008 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.068540 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgvmd\" (UniqueName: \"kubernetes.io/projected/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-kube-api-access-wgvmd\") pod \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\" (UID: \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\") " Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.068764 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-combined-ca-bundle\") pod \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\" (UID: \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\") " Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.068908 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-config-data\") pod \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\" (UID: \"9f9106d0-e19e-47c3-b7fb-8903eb6459ab\") " Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.082986 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-kube-api-access-wgvmd" (OuterVolumeSpecName: "kube-api-access-wgvmd") pod "9f9106d0-e19e-47c3-b7fb-8903eb6459ab" (UID: "9f9106d0-e19e-47c3-b7fb-8903eb6459ab"). InnerVolumeSpecName "kube-api-access-wgvmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.112854 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f9106d0-e19e-47c3-b7fb-8903eb6459ab" (UID: "9f9106d0-e19e-47c3-b7fb-8903eb6459ab"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.126732 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-config-data" (OuterVolumeSpecName: "config-data") pod "9f9106d0-e19e-47c3-b7fb-8903eb6459ab" (UID: "9f9106d0-e19e-47c3-b7fb-8903eb6459ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.127631 5008 generic.go:334] "Generic (PLEG): container finished" podID="9f9106d0-e19e-47c3-b7fb-8903eb6459ab" containerID="3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a" exitCode=0 Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.127694 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f9106d0-e19e-47c3-b7fb-8903eb6459ab","Type":"ContainerDied","Data":"3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a"} Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.127720 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f9106d0-e19e-47c3-b7fb-8903eb6459ab","Type":"ContainerDied","Data":"05990e61f325f91f0e40df210dfb5ab7ba5d44c72be720f8bda6026fc4174941"} Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.127735 5008 scope.go:117] "RemoveContainer" containerID="3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.127728 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.131845 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dd462bb4-44f5-4e0f-bc17-53d24604d474","Type":"ContainerStarted","Data":"efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a"} Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.131893 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dd462bb4-44f5-4e0f-bc17-53d24604d474","Type":"ContainerStarted","Data":"a4872b08063355800bfb36664d6f95c7a3bbc1b3d95899e36c12794882bb383f"} Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.131951 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.149921 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.149903126 podStartE2EDuration="2.149903126s" podCreationTimestamp="2026-03-18 18:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:25:49.148982682 +0000 UTC m=+1405.668455761" watchObservedRunningTime="2026-03-18 18:25:49.149903126 +0000 UTC m=+1405.669376205" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.170860 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgvmd\" (UniqueName: \"kubernetes.io/projected/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-kube-api-access-wgvmd\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.170890 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:49 crc 
kubenswrapper[5008]: I0318 18:25:49.170905 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9106d0-e19e-47c3-b7fb-8903eb6459ab-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.176211 5008 scope.go:117] "RemoveContainer" containerID="3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a" Mar 18 18:25:49 crc kubenswrapper[5008]: E0318 18:25:49.177795 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a\": container with ID starting with 3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a not found: ID does not exist" containerID="3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.177844 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a"} err="failed to get container status \"3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a\": rpc error: code = NotFound desc = could not find container \"3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a\": container with ID starting with 3539d47dba7a77ca7bc6377b4aea2a3a0576ea81c4b4e0f368ea89b61993378a not found: ID does not exist" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.182524 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.197688 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.206351 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:25:49 crc kubenswrapper[5008]: E0318 18:25:49.206873 5008 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9106d0-e19e-47c3-b7fb-8903eb6459ab" containerName="nova-scheduler-scheduler" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.206894 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9106d0-e19e-47c3-b7fb-8903eb6459ab" containerName="nova-scheduler-scheduler" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.207147 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f9106d0-e19e-47c3-b7fb-8903eb6459ab" containerName="nova-scheduler-scheduler" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.207923 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.222002 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.235256 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.272020 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk2nj\" (UniqueName: \"kubernetes.io/projected/34863810-2f18-4480-a22c-d4a953287b50-kube-api-access-rk2nj\") pod \"nova-scheduler-0\" (UID: \"34863810-2f18-4480-a22c-d4a953287b50\") " pod="openstack/nova-scheduler-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.272070 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"34863810-2f18-4480-a22c-d4a953287b50\") " pod="openstack/nova-scheduler-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.272219 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-config-data\") pod \"nova-scheduler-0\" (UID: \"34863810-2f18-4480-a22c-d4a953287b50\") " pod="openstack/nova-scheduler-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.373400 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-config-data\") pod \"nova-scheduler-0\" (UID: \"34863810-2f18-4480-a22c-d4a953287b50\") " pod="openstack/nova-scheduler-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.373502 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk2nj\" (UniqueName: \"kubernetes.io/projected/34863810-2f18-4480-a22c-d4a953287b50-kube-api-access-rk2nj\") pod \"nova-scheduler-0\" (UID: \"34863810-2f18-4480-a22c-d4a953287b50\") " pod="openstack/nova-scheduler-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.373532 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"34863810-2f18-4480-a22c-d4a953287b50\") " pod="openstack/nova-scheduler-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.377725 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-config-data\") pod \"nova-scheduler-0\" (UID: \"34863810-2f18-4480-a22c-d4a953287b50\") " pod="openstack/nova-scheduler-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.380018 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"34863810-2f18-4480-a22c-d4a953287b50\") 
" pod="openstack/nova-scheduler-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.388628 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk2nj\" (UniqueName: \"kubernetes.io/projected/34863810-2f18-4480-a22c-d4a953287b50-kube-api-access-rk2nj\") pod \"nova-scheduler-0\" (UID: \"34863810-2f18-4480-a22c-d4a953287b50\") " pod="openstack/nova-scheduler-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.540196 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.837616 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.983539 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-config-data\") pod \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.983633 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-897b5\" (UniqueName: \"kubernetes.io/projected/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-kube-api-access-897b5\") pod \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.983683 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-logs\") pod \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.983725 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-combined-ca-bundle\") pod \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\" (UID: \"b7714967-8c6f-4eac-8e87-6eb2c1cb754c\") " Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.984230 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-logs" (OuterVolumeSpecName: "logs") pod "b7714967-8c6f-4eac-8e87-6eb2c1cb754c" (UID: "b7714967-8c6f-4eac-8e87-6eb2c1cb754c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.984575 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:49 crc kubenswrapper[5008]: I0318 18:25:49.988236 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-kube-api-access-897b5" (OuterVolumeSpecName: "kube-api-access-897b5") pod "b7714967-8c6f-4eac-8e87-6eb2c1cb754c" (UID: "b7714967-8c6f-4eac-8e87-6eb2c1cb754c"). InnerVolumeSpecName "kube-api-access-897b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.008439 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7714967-8c6f-4eac-8e87-6eb2c1cb754c" (UID: "b7714967-8c6f-4eac-8e87-6eb2c1cb754c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.014819 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-config-data" (OuterVolumeSpecName: "config-data") pod "b7714967-8c6f-4eac-8e87-6eb2c1cb754c" (UID: "b7714967-8c6f-4eac-8e87-6eb2c1cb754c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.048692 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:25:50 crc kubenswrapper[5008]: W0318 18:25:50.052269 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34863810_2f18_4480_a22c_d4a953287b50.slice/crio-96096ace69f382aba16f4e474c79515fd73616426d1728118c9a2a887a43e3f2 WatchSource:0}: Error finding container 96096ace69f382aba16f4e474c79515fd73616426d1728118c9a2a887a43e3f2: Status 404 returned error can't find the container with id 96096ace69f382aba16f4e474c79515fd73616426d1728118c9a2a887a43e3f2 Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.086488 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.086519 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-897b5\" (UniqueName: \"kubernetes.io/projected/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-kube-api-access-897b5\") on node \"crc\" DevicePath \"\"" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.086529 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7714967-8c6f-4eac-8e87-6eb2c1cb754c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 
18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.144133 5008 generic.go:334] "Generic (PLEG): container finished" podID="b7714967-8c6f-4eac-8e87-6eb2c1cb754c" containerID="b732b6502fad5d9dd985aba72f2285f3e19b23d0713bcb80c127c8a5a313a2d8" exitCode=0 Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.144221 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7714967-8c6f-4eac-8e87-6eb2c1cb754c","Type":"ContainerDied","Data":"b732b6502fad5d9dd985aba72f2285f3e19b23d0713bcb80c127c8a5a313a2d8"} Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.144259 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7714967-8c6f-4eac-8e87-6eb2c1cb754c","Type":"ContainerDied","Data":"9d2fe1e22f9dd909f3a53dbcc7664e9d830766abef70e7355363974ad07e2672"} Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.144282 5008 scope.go:117] "RemoveContainer" containerID="b732b6502fad5d9dd985aba72f2285f3e19b23d0713bcb80c127c8a5a313a2d8" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.144457 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.148450 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"34863810-2f18-4480-a22c-d4a953287b50","Type":"ContainerStarted","Data":"96096ace69f382aba16f4e474c79515fd73616426d1728118c9a2a887a43e3f2"} Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.175389 5008 scope.go:117] "RemoveContainer" containerID="f557c6a8ce1ba2781fcef8df9f5cf23382ccb3e15dc9bc4306d368be51d91304" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.187700 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.209519 5008 scope.go:117] "RemoveContainer" containerID="b732b6502fad5d9dd985aba72f2285f3e19b23d0713bcb80c127c8a5a313a2d8" Mar 18 18:25:50 crc kubenswrapper[5008]: E0318 18:25:50.210739 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b732b6502fad5d9dd985aba72f2285f3e19b23d0713bcb80c127c8a5a313a2d8\": container with ID starting with b732b6502fad5d9dd985aba72f2285f3e19b23d0713bcb80c127c8a5a313a2d8 not found: ID does not exist" containerID="b732b6502fad5d9dd985aba72f2285f3e19b23d0713bcb80c127c8a5a313a2d8" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.210776 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b732b6502fad5d9dd985aba72f2285f3e19b23d0713bcb80c127c8a5a313a2d8"} err="failed to get container status \"b732b6502fad5d9dd985aba72f2285f3e19b23d0713bcb80c127c8a5a313a2d8\": rpc error: code = NotFound desc = could not find container \"b732b6502fad5d9dd985aba72f2285f3e19b23d0713bcb80c127c8a5a313a2d8\": container with ID starting with b732b6502fad5d9dd985aba72f2285f3e19b23d0713bcb80c127c8a5a313a2d8 not found: ID does not exist" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.210800 5008 scope.go:117] 
"RemoveContainer" containerID="f557c6a8ce1ba2781fcef8df9f5cf23382ccb3e15dc9bc4306d368be51d91304" Mar 18 18:25:50 crc kubenswrapper[5008]: E0318 18:25:50.213146 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f557c6a8ce1ba2781fcef8df9f5cf23382ccb3e15dc9bc4306d368be51d91304\": container with ID starting with f557c6a8ce1ba2781fcef8df9f5cf23382ccb3e15dc9bc4306d368be51d91304 not found: ID does not exist" containerID="f557c6a8ce1ba2781fcef8df9f5cf23382ccb3e15dc9bc4306d368be51d91304" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.213185 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f557c6a8ce1ba2781fcef8df9f5cf23382ccb3e15dc9bc4306d368be51d91304"} err="failed to get container status \"f557c6a8ce1ba2781fcef8df9f5cf23382ccb3e15dc9bc4306d368be51d91304\": rpc error: code = NotFound desc = could not find container \"f557c6a8ce1ba2781fcef8df9f5cf23382ccb3e15dc9bc4306d368be51d91304\": container with ID starting with f557c6a8ce1ba2781fcef8df9f5cf23382ccb3e15dc9bc4306d368be51d91304 not found: ID does not exist" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.214915 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f9106d0-e19e-47c3-b7fb-8903eb6459ab" path="/var/lib/kubelet/pods/9f9106d0-e19e-47c3-b7fb-8903eb6459ab/volumes" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.215457 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.215488 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 18:25:50 crc kubenswrapper[5008]: E0318 18:25:50.215825 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7714967-8c6f-4eac-8e87-6eb2c1cb754c" containerName="nova-api-log" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.215852 5008 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b7714967-8c6f-4eac-8e87-6eb2c1cb754c" containerName="nova-api-log" Mar 18 18:25:50 crc kubenswrapper[5008]: E0318 18:25:50.215883 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7714967-8c6f-4eac-8e87-6eb2c1cb754c" containerName="nova-api-api" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.215894 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7714967-8c6f-4eac-8e87-6eb2c1cb754c" containerName="nova-api-api" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.216094 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7714967-8c6f-4eac-8e87-6eb2c1cb754c" containerName="nova-api-api" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.216126 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7714967-8c6f-4eac-8e87-6eb2c1cb754c" containerName="nova-api-log" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.221920 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.224529 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.250428 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.394027 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhxtb\" (UniqueName: \"kubernetes.io/projected/cdf2f634-339e-4f92-8046-2786b3e31caf-kube-api-access-hhxtb\") pod \"nova-api-0\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.394112 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdf2f634-339e-4f92-8046-2786b3e31caf-logs\") 
pod \"nova-api-0\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.394189 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf2f634-339e-4f92-8046-2786b3e31caf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.394236 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf2f634-339e-4f92-8046-2786b3e31caf-config-data\") pod \"nova-api-0\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.495811 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf2f634-339e-4f92-8046-2786b3e31caf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.496251 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf2f634-339e-4f92-8046-2786b3e31caf-config-data\") pod \"nova-api-0\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.496355 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhxtb\" (UniqueName: \"kubernetes.io/projected/cdf2f634-339e-4f92-8046-2786b3e31caf-kube-api-access-hhxtb\") pod \"nova-api-0\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.496402 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdf2f634-339e-4f92-8046-2786b3e31caf-logs\") pod \"nova-api-0\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.497056 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdf2f634-339e-4f92-8046-2786b3e31caf-logs\") pod \"nova-api-0\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.502722 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf2f634-339e-4f92-8046-2786b3e31caf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.503317 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf2f634-339e-4f92-8046-2786b3e31caf-config-data\") pod \"nova-api-0\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.516038 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhxtb\" (UniqueName: \"kubernetes.io/projected/cdf2f634-339e-4f92-8046-2786b3e31caf-kube-api-access-hhxtb\") pod \"nova-api-0\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " pod="openstack/nova-api-0" Mar 18 18:25:50 crc kubenswrapper[5008]: I0318 18:25:50.592478 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:25:51 crc kubenswrapper[5008]: I0318 18:25:51.075347 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:25:51 crc kubenswrapper[5008]: I0318 18:25:51.158841 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cdf2f634-339e-4f92-8046-2786b3e31caf","Type":"ContainerStarted","Data":"dce5a43499461ffacf95b2032d1d6b0b8034a9a1429756ef2791d656d2fcd38e"} Mar 18 18:25:51 crc kubenswrapper[5008]: I0318 18:25:51.160967 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"34863810-2f18-4480-a22c-d4a953287b50","Type":"ContainerStarted","Data":"59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914"} Mar 18 18:25:51 crc kubenswrapper[5008]: I0318 18:25:51.184780 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.184761376 podStartE2EDuration="2.184761376s" podCreationTimestamp="2026-03-18 18:25:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:25:51.182143967 +0000 UTC m=+1407.701617086" watchObservedRunningTime="2026-03-18 18:25:51.184761376 +0000 UTC m=+1407.704234455" Mar 18 18:25:52 crc kubenswrapper[5008]: I0318 18:25:52.175168 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cdf2f634-339e-4f92-8046-2786b3e31caf","Type":"ContainerStarted","Data":"f634b616c95f98c0c1116dd654fd4d78c457ad97057c7814e24970411173efa2"} Mar 18 18:25:52 crc kubenswrapper[5008]: I0318 18:25:52.175893 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cdf2f634-339e-4f92-8046-2786b3e31caf","Type":"ContainerStarted","Data":"a8446aee849d17ed13bd2bcdaf2d2dd2f75588322121bac367467f5be43a74b6"} Mar 18 18:25:52 crc kubenswrapper[5008]: I0318 
18:25:52.201467 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.201441047 podStartE2EDuration="2.201441047s" podCreationTimestamp="2026-03-18 18:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:25:52.196463426 +0000 UTC m=+1408.715936525" watchObservedRunningTime="2026-03-18 18:25:52.201441047 +0000 UTC m=+1408.720914146" Mar 18 18:25:52 crc kubenswrapper[5008]: I0318 18:25:52.220137 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7714967-8c6f-4eac-8e87-6eb2c1cb754c" path="/var/lib/kubelet/pods/b7714967-8c6f-4eac-8e87-6eb2c1cb754c/volumes" Mar 18 18:25:54 crc kubenswrapper[5008]: I0318 18:25:54.541514 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 18:25:56 crc kubenswrapper[5008]: I0318 18:25:56.487660 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 18:25:56 crc kubenswrapper[5008]: I0318 18:25:56.487701 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 18:25:57 crc kubenswrapper[5008]: I0318 18:25:57.507727 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ff482a80-2d35-4ec7-a524-67c8bbf33b5f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 18:25:57 crc kubenswrapper[5008]: I0318 18:25:57.507744 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ff482a80-2d35-4ec7-a524-67c8bbf33b5f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Mar 18 18:25:57 crc kubenswrapper[5008]: I0318 18:25:57.790154 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 18:25:59 crc kubenswrapper[5008]: I0318 18:25:59.540911 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 18:25:59 crc kubenswrapper[5008]: I0318 18:25:59.571324 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.155997 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564306-zjwp8"] Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.158077 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564306-zjwp8" Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.160936 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.163247 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.163456 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.176594 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564306-zjwp8"] Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.279524 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d9rd\" (UniqueName: \"kubernetes.io/projected/26e86e59-8c2c-45e0-b661-4e9e815e138b-kube-api-access-8d9rd\") pod \"auto-csr-approver-29564306-zjwp8\" (UID: \"26e86e59-8c2c-45e0-b661-4e9e815e138b\") " 
pod="openshift-infra/auto-csr-approver-29564306-zjwp8" Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.328393 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.382147 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d9rd\" (UniqueName: \"kubernetes.io/projected/26e86e59-8c2c-45e0-b661-4e9e815e138b-kube-api-access-8d9rd\") pod \"auto-csr-approver-29564306-zjwp8\" (UID: \"26e86e59-8c2c-45e0-b661-4e9e815e138b\") " pod="openshift-infra/auto-csr-approver-29564306-zjwp8" Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.405103 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d9rd\" (UniqueName: \"kubernetes.io/projected/26e86e59-8c2c-45e0-b661-4e9e815e138b-kube-api-access-8d9rd\") pod \"auto-csr-approver-29564306-zjwp8\" (UID: \"26e86e59-8c2c-45e0-b661-4e9e815e138b\") " pod="openshift-infra/auto-csr-approver-29564306-zjwp8" Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.500120 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564306-zjwp8" Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.593196 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.593250 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 18:26:00 crc kubenswrapper[5008]: I0318 18:26:00.964840 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564306-zjwp8"] Mar 18 18:26:00 crc kubenswrapper[5008]: W0318 18:26:00.974285 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26e86e59_8c2c_45e0_b661_4e9e815e138b.slice/crio-a186a88344365ef6eb13cca209cbbed0a51cf9f4a5ac4bdf3179b3b58b48ba73 WatchSource:0}: Error finding container a186a88344365ef6eb13cca209cbbed0a51cf9f4a5ac4bdf3179b3b58b48ba73: Status 404 returned error can't find the container with id a186a88344365ef6eb13cca209cbbed0a51cf9f4a5ac4bdf3179b3b58b48ba73 Mar 18 18:26:01 crc kubenswrapper[5008]: I0318 18:26:01.284022 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564306-zjwp8" event={"ID":"26e86e59-8c2c-45e0-b661-4e9e815e138b","Type":"ContainerStarted","Data":"a186a88344365ef6eb13cca209cbbed0a51cf9f4a5ac4bdf3179b3b58b48ba73"} Mar 18 18:26:01 crc kubenswrapper[5008]: I0318 18:26:01.675869 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cdf2f634-339e-4f92-8046-2786b3e31caf" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:26:01 crc kubenswrapper[5008]: I0318 18:26:01.675911 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="cdf2f634-339e-4f92-8046-2786b3e31caf" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 18:26:04 crc kubenswrapper[5008]: I0318 18:26:04.486878 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 18:26:04 crc kubenswrapper[5008]: I0318 18:26:04.487236 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 18:26:05 crc kubenswrapper[5008]: I0318 18:26:05.337202 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564306-zjwp8" event={"ID":"26e86e59-8c2c-45e0-b661-4e9e815e138b","Type":"ContainerStarted","Data":"2e5a59c9d7fb057e0e04ebc89497f693bac1a00de19d2d85b96fb8e2a863fd3b"} Mar 18 18:26:05 crc kubenswrapper[5008]: I0318 18:26:05.355114 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564306-zjwp8" podStartSLOduration=1.416529489 podStartE2EDuration="5.355092271s" podCreationTimestamp="2026-03-18 18:26:00 +0000 UTC" firstStartedPulling="2026-03-18 18:26:00.977806104 +0000 UTC m=+1417.497279193" lastFinishedPulling="2026-03-18 18:26:04.916368856 +0000 UTC m=+1421.435841975" observedRunningTime="2026-03-18 18:26:05.351680822 +0000 UTC m=+1421.871153931" watchObservedRunningTime="2026-03-18 18:26:05.355092271 +0000 UTC m=+1421.874565370" Mar 18 18:26:06 crc kubenswrapper[5008]: I0318 18:26:06.345651 5008 generic.go:334] "Generic (PLEG): container finished" podID="26e86e59-8c2c-45e0-b661-4e9e815e138b" containerID="2e5a59c9d7fb057e0e04ebc89497f693bac1a00de19d2d85b96fb8e2a863fd3b" exitCode=0 Mar 18 18:26:06 crc kubenswrapper[5008]: I0318 18:26:06.345784 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564306-zjwp8" 
event={"ID":"26e86e59-8c2c-45e0-b661-4e9e815e138b","Type":"ContainerDied","Data":"2e5a59c9d7fb057e0e04ebc89497f693bac1a00de19d2d85b96fb8e2a863fd3b"} Mar 18 18:26:06 crc kubenswrapper[5008]: I0318 18:26:06.497183 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 18:26:06 crc kubenswrapper[5008]: I0318 18:26:06.504575 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 18:26:06 crc kubenswrapper[5008]: I0318 18:26:06.507711 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 18:26:07 crc kubenswrapper[5008]: I0318 18:26:07.364188 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 18:26:07 crc kubenswrapper[5008]: I0318 18:26:07.821702 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564306-zjwp8" Mar 18 18:26:07 crc kubenswrapper[5008]: I0318 18:26:07.932228 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d9rd\" (UniqueName: \"kubernetes.io/projected/26e86e59-8c2c-45e0-b661-4e9e815e138b-kube-api-access-8d9rd\") pod \"26e86e59-8c2c-45e0-b661-4e9e815e138b\" (UID: \"26e86e59-8c2c-45e0-b661-4e9e815e138b\") " Mar 18 18:26:07 crc kubenswrapper[5008]: I0318 18:26:07.941049 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e86e59-8c2c-45e0-b661-4e9e815e138b-kube-api-access-8d9rd" (OuterVolumeSpecName: "kube-api-access-8d9rd") pod "26e86e59-8c2c-45e0-b661-4e9e815e138b" (UID: "26e86e59-8c2c-45e0-b661-4e9e815e138b"). InnerVolumeSpecName "kube-api-access-8d9rd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.035072 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d9rd\" (UniqueName: \"kubernetes.io/projected/26e86e59-8c2c-45e0-b661-4e9e815e138b-kube-api-access-8d9rd\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.263292 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.367655 5008 generic.go:334] "Generic (PLEG): container finished" podID="28217cfa-14bc-4fef-bea3-f1ea3a446ee5" containerID="59627fdb11217c45e0ed161286904dc1f4203c7e3e74b368f09133d93ec26680" exitCode=137 Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.367709 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"28217cfa-14bc-4fef-bea3-f1ea3a446ee5","Type":"ContainerDied","Data":"59627fdb11217c45e0ed161286904dc1f4203c7e3e74b368f09133d93ec26680"} Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.367732 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"28217cfa-14bc-4fef-bea3-f1ea3a446ee5","Type":"ContainerDied","Data":"33ac2328f32e1cb4d3136f65f38fda1919f7cef8a7bfec58d1b71f6a7a086a2f"} Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.367749 5008 scope.go:117] "RemoveContainer" containerID="59627fdb11217c45e0ed161286904dc1f4203c7e3e74b368f09133d93ec26680" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.367838 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.372335 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564306-zjwp8" event={"ID":"26e86e59-8c2c-45e0-b661-4e9e815e138b","Type":"ContainerDied","Data":"a186a88344365ef6eb13cca209cbbed0a51cf9f4a5ac4bdf3179b3b58b48ba73"} Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.372359 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564306-zjwp8" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.372380 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a186a88344365ef6eb13cca209cbbed0a51cf9f4a5ac4bdf3179b3b58b48ba73" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.406804 5008 scope.go:117] "RemoveContainer" containerID="59627fdb11217c45e0ed161286904dc1f4203c7e3e74b368f09133d93ec26680" Mar 18 18:26:08 crc kubenswrapper[5008]: E0318 18:26:08.407227 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59627fdb11217c45e0ed161286904dc1f4203c7e3e74b368f09133d93ec26680\": container with ID starting with 59627fdb11217c45e0ed161286904dc1f4203c7e3e74b368f09133d93ec26680 not found: ID does not exist" containerID="59627fdb11217c45e0ed161286904dc1f4203c7e3e74b368f09133d93ec26680" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.407273 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59627fdb11217c45e0ed161286904dc1f4203c7e3e74b368f09133d93ec26680"} err="failed to get container status \"59627fdb11217c45e0ed161286904dc1f4203c7e3e74b368f09133d93ec26680\": rpc error: code = NotFound desc = could not find container \"59627fdb11217c45e0ed161286904dc1f4203c7e3e74b368f09133d93ec26680\": container with ID starting with 
59627fdb11217c45e0ed161286904dc1f4203c7e3e74b368f09133d93ec26680 not found: ID does not exist" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.433966 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564300-58hhx"] Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.441201 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs9lx\" (UniqueName: \"kubernetes.io/projected/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-kube-api-access-rs9lx\") pod \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\" (UID: \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\") " Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.441296 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-config-data\") pod \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\" (UID: \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\") " Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.441395 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-combined-ca-bundle\") pod \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\" (UID: \"28217cfa-14bc-4fef-bea3-f1ea3a446ee5\") " Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.441697 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564300-58hhx"] Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.445147 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.445897 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-kube-api-access-rs9lx" (OuterVolumeSpecName: "kube-api-access-rs9lx") pod "28217cfa-14bc-4fef-bea3-f1ea3a446ee5" 
(UID: "28217cfa-14bc-4fef-bea3-f1ea3a446ee5"). InnerVolumeSpecName "kube-api-access-rs9lx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.472444 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-config-data" (OuterVolumeSpecName: "config-data") pod "28217cfa-14bc-4fef-bea3-f1ea3a446ee5" (UID: "28217cfa-14bc-4fef-bea3-f1ea3a446ee5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.472546 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28217cfa-14bc-4fef-bea3-f1ea3a446ee5" (UID: "28217cfa-14bc-4fef-bea3-f1ea3a446ee5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.543891 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.543927 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs9lx\" (UniqueName: \"kubernetes.io/projected/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-kube-api-access-rs9lx\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.543939 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28217cfa-14bc-4fef-bea3-f1ea3a446ee5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.592804 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 
18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.592865 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.699721 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.710164 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.763037 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:26:08 crc kubenswrapper[5008]: E0318 18:26:08.764023 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28217cfa-14bc-4fef-bea3-f1ea3a446ee5" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.764049 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="28217cfa-14bc-4fef-bea3-f1ea3a446ee5" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 18:26:08 crc kubenswrapper[5008]: E0318 18:26:08.764125 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e86e59-8c2c-45e0-b661-4e9e815e138b" containerName="oc" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.764137 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e86e59-8c2c-45e0-b661-4e9e815e138b" containerName="oc" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.764590 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="28217cfa-14bc-4fef-bea3-f1ea3a446ee5" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.764644 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e86e59-8c2c-45e0-b661-4e9e815e138b" containerName="oc" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.765737 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.771263 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.771539 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.773927 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.781165 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.850902 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.851079 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfx95\" (UniqueName: \"kubernetes.io/projected/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-kube-api-access-hfx95\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.851153 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 
18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.851226 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.851251 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.952575 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.952714 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.952751 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfx95\" (UniqueName: \"kubernetes.io/projected/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-kube-api-access-hfx95\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 
18:26:08.952803 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.952848 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.956666 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.956792 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.956866 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.957037 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:08 crc kubenswrapper[5008]: I0318 18:26:08.968214 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfx95\" (UniqueName: \"kubernetes.io/projected/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-kube-api-access-hfx95\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:09 crc kubenswrapper[5008]: I0318 18:26:09.105191 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:09 crc kubenswrapper[5008]: I0318 18:26:09.520898 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:26:10 crc kubenswrapper[5008]: I0318 18:26:10.213598 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28217cfa-14bc-4fef-bea3-f1ea3a446ee5" path="/var/lib/kubelet/pods/28217cfa-14bc-4fef-bea3-f1ea3a446ee5/volumes" Mar 18 18:26:10 crc kubenswrapper[5008]: I0318 18:26:10.214744 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd02a294-4a36-457a-9573-6cb6d07c841b" path="/var/lib/kubelet/pods/dd02a294-4a36-457a-9573-6cb6d07c841b/volumes" Mar 18 18:26:10 crc kubenswrapper[5008]: I0318 18:26:10.395704 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7ab5f625-144a-4c7c-bab8-5399de3b5a8e","Type":"ContainerStarted","Data":"c2dddb0fc09ab3bed8b4a15346419bed092d4dcea57bf35b28cdac3523e7802a"} Mar 18 18:26:10 crc kubenswrapper[5008]: I0318 18:26:10.395745 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"7ab5f625-144a-4c7c-bab8-5399de3b5a8e","Type":"ContainerStarted","Data":"8b2ae319823e9e3c57fabe7c051b54b23f6b58f3454be8120ede9b39f41397d6"} Mar 18 18:26:10 crc kubenswrapper[5008]: I0318 18:26:10.415617 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.415590637 podStartE2EDuration="2.415590637s" podCreationTimestamp="2026-03-18 18:26:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:26:10.409142416 +0000 UTC m=+1426.928615505" watchObservedRunningTime="2026-03-18 18:26:10.415590637 +0000 UTC m=+1426.935063726" Mar 18 18:26:10 crc kubenswrapper[5008]: I0318 18:26:10.614937 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 18:26:10 crc kubenswrapper[5008]: I0318 18:26:10.616375 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 18:26:10 crc kubenswrapper[5008]: I0318 18:26:10.617972 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.407541 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.584032 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bd85b459c-gfwz9"] Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.586161 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.600712 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bd85b459c-gfwz9"] Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.702042 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-dns-swift-storage-0\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.702091 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-dns-svc\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.702117 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-ovsdbserver-nb\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.702180 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-config\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.702203 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-ovsdbserver-sb\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.702218 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrc8c\" (UniqueName: \"kubernetes.io/projected/68b393c9-78fb-4bde-930d-6af4b840f9e3-kube-api-access-rrc8c\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.804379 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-dns-svc\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.804437 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-ovsdbserver-nb\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.804473 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-config\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.804495 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-ovsdbserver-sb\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.804515 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrc8c\" (UniqueName: \"kubernetes.io/projected/68b393c9-78fb-4bde-930d-6af4b840f9e3-kube-api-access-rrc8c\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.804661 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-dns-swift-storage-0\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.805433 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-dns-swift-storage-0\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.805507 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-dns-svc\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.805511 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-ovsdbserver-nb\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.805768 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-ovsdbserver-sb\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.805888 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-config\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.841519 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrc8c\" (UniqueName: \"kubernetes.io/projected/68b393c9-78fb-4bde-930d-6af4b840f9e3-kube-api-access-rrc8c\") pod \"dnsmasq-dns-6bd85b459c-gfwz9\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.863827 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cc95l"] Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.913113 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.914524 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:11 crc kubenswrapper[5008]: I0318 18:26:11.943700 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cc95l"] Mar 18 18:26:12 crc kubenswrapper[5008]: I0318 18:26:12.011433 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rxm4\" (UniqueName: \"kubernetes.io/projected/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-kube-api-access-6rxm4\") pod \"community-operators-cc95l\" (UID: \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\") " pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:12 crc kubenswrapper[5008]: I0318 18:26:12.011851 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-utilities\") pod \"community-operators-cc95l\" (UID: \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\") " pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:12 crc kubenswrapper[5008]: I0318 18:26:12.011983 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-catalog-content\") pod \"community-operators-cc95l\" (UID: \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\") " pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:12 crc kubenswrapper[5008]: I0318 18:26:12.114069 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rxm4\" (UniqueName: \"kubernetes.io/projected/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-kube-api-access-6rxm4\") pod \"community-operators-cc95l\" (UID: \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\") " pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:12 crc kubenswrapper[5008]: I0318 18:26:12.114160 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-utilities\") pod \"community-operators-cc95l\" (UID: \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\") " pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:12 crc kubenswrapper[5008]: I0318 18:26:12.114177 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-catalog-content\") pod \"community-operators-cc95l\" (UID: \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\") " pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:12 crc kubenswrapper[5008]: I0318 18:26:12.114772 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-catalog-content\") pod \"community-operators-cc95l\" (UID: \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\") " pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:12 crc kubenswrapper[5008]: I0318 18:26:12.117329 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-utilities\") pod \"community-operators-cc95l\" (UID: \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\") " pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:12 crc kubenswrapper[5008]: I0318 18:26:12.136212 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rxm4\" (UniqueName: \"kubernetes.io/projected/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-kube-api-access-6rxm4\") pod \"community-operators-cc95l\" (UID: \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\") " pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:12 crc kubenswrapper[5008]: I0318 18:26:12.250010 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:13 crc kubenswrapper[5008]: I0318 18:26:12.498709 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bd85b459c-gfwz9"] Mar 18 18:26:13 crc kubenswrapper[5008]: I0318 18:26:13.412607 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cc95l"] Mar 18 18:26:13 crc kubenswrapper[5008]: I0318 18:26:13.439483 5008 generic.go:334] "Generic (PLEG): container finished" podID="68b393c9-78fb-4bde-930d-6af4b840f9e3" containerID="aacd5171f8a4d4ac3c147fedb821e1e7fd55e12771e32ff44716b52bc9ceb37d" exitCode=0 Mar 18 18:26:13 crc kubenswrapper[5008]: I0318 18:26:13.439656 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" event={"ID":"68b393c9-78fb-4bde-930d-6af4b840f9e3","Type":"ContainerDied","Data":"aacd5171f8a4d4ac3c147fedb821e1e7fd55e12771e32ff44716b52bc9ceb37d"} Mar 18 18:26:13 crc kubenswrapper[5008]: I0318 18:26:13.439686 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" event={"ID":"68b393c9-78fb-4bde-930d-6af4b840f9e3","Type":"ContainerStarted","Data":"9122f6f26080b81ea5afa92af72ef472499bf65dadd0621b6646e545ce98fe62"} Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.105748 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.250894 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5bmh8"] Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.253296 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5bmh8" Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.272817 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5bmh8"] Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.366457 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.371757 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jncsl\" (UniqueName: \"kubernetes.io/projected/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-kube-api-access-jncsl\") pod \"redhat-operators-5bmh8\" (UID: \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\") " pod="openshift-marketplace/redhat-operators-5bmh8" Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.371897 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-catalog-content\") pod \"redhat-operators-5bmh8\" (UID: \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\") " pod="openshift-marketplace/redhat-operators-5bmh8" Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.371962 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-utilities\") pod \"redhat-operators-5bmh8\" (UID: \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\") " pod="openshift-marketplace/redhat-operators-5bmh8" Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.450111 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" event={"ID":"68b393c9-78fb-4bde-930d-6af4b840f9e3","Type":"ContainerStarted","Data":"22db4948be656e49590d67a4b03650ce23c340a3e654f974e8ccf95e3e514659"} Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.450234 
5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.451937 5008 generic.go:334] "Generic (PLEG): container finished" podID="2716be49-c7ba-47b8-8c63-4e5d9bd7156c" containerID="f8ae42b0df6d40272b523b86c73c07745c5f2b4e2ef16bb25015eb98894d1158" exitCode=0 Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.452055 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cc95l" event={"ID":"2716be49-c7ba-47b8-8c63-4e5d9bd7156c","Type":"ContainerDied","Data":"f8ae42b0df6d40272b523b86c73c07745c5f2b4e2ef16bb25015eb98894d1158"} Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.452094 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cc95l" event={"ID":"2716be49-c7ba-47b8-8c63-4e5d9bd7156c","Type":"ContainerStarted","Data":"61dd1ff78cded048717ebec85dbe465d58bbd5bc16910835ce193ef2b552a985"} Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.452114 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cdf2f634-339e-4f92-8046-2786b3e31caf" containerName="nova-api-log" containerID="cri-o://a8446aee849d17ed13bd2bcdaf2d2dd2f75588322121bac367467f5be43a74b6" gracePeriod=30 Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.452157 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cdf2f634-339e-4f92-8046-2786b3e31caf" containerName="nova-api-api" containerID="cri-o://f634b616c95f98c0c1116dd654fd4d78c457ad97057c7814e24970411173efa2" gracePeriod=30 Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.473133 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-catalog-content\") pod \"redhat-operators-5bmh8\" (UID: 
\"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\") " pod="openshift-marketplace/redhat-operators-5bmh8" Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.473219 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-utilities\") pod \"redhat-operators-5bmh8\" (UID: \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\") " pod="openshift-marketplace/redhat-operators-5bmh8" Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.473274 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jncsl\" (UniqueName: \"kubernetes.io/projected/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-kube-api-access-jncsl\") pod \"redhat-operators-5bmh8\" (UID: \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\") " pod="openshift-marketplace/redhat-operators-5bmh8" Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.473912 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-catalog-content\") pod \"redhat-operators-5bmh8\" (UID: \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\") " pod="openshift-marketplace/redhat-operators-5bmh8" Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.474128 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-utilities\") pod \"redhat-operators-5bmh8\" (UID: \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\") " pod="openshift-marketplace/redhat-operators-5bmh8" Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.476924 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" podStartSLOduration=3.476905318 podStartE2EDuration="3.476905318s" podCreationTimestamp="2026-03-18 18:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:26:14.469033509 +0000 UTC m=+1430.988506588" watchObservedRunningTime="2026-03-18 18:26:14.476905318 +0000 UTC m=+1430.996378397" Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.501515 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jncsl\" (UniqueName: \"kubernetes.io/projected/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-kube-api-access-jncsl\") pod \"redhat-operators-5bmh8\" (UID: \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\") " pod="openshift-marketplace/redhat-operators-5bmh8" Mar 18 18:26:14 crc kubenswrapper[5008]: I0318 18:26:14.606391 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5bmh8" Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.106427 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5bmh8"] Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.208350 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.208623 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="ceilometer-central-agent" containerID="cri-o://05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7" gracePeriod=30 Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.208704 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="proxy-httpd" containerID="cri-o://c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37" gracePeriod=30 Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.208763 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="ceilometer-notification-agent" containerID="cri-o://c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676" gracePeriod=30 Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.208969 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="sg-core" containerID="cri-o://bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583" gracePeriod=30 Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.475125 5008 generic.go:334] "Generic (PLEG): container finished" podID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerID="c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37" exitCode=0 Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.475160 5008 generic.go:334] "Generic (PLEG): container finished" podID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerID="bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583" exitCode=2 Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.475211 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"719c54e1-6e5f-4769-94b0-c3f329fcf966","Type":"ContainerDied","Data":"c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37"} Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.475241 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"719c54e1-6e5f-4769-94b0-c3f329fcf966","Type":"ContainerDied","Data":"bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583"} Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.480729 5008 generic.go:334] "Generic (PLEG): container finished" podID="cdf2f634-339e-4f92-8046-2786b3e31caf" containerID="a8446aee849d17ed13bd2bcdaf2d2dd2f75588322121bac367467f5be43a74b6" exitCode=143 Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.480776 5008 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"cdf2f634-339e-4f92-8046-2786b3e31caf","Type":"ContainerDied","Data":"a8446aee849d17ed13bd2bcdaf2d2dd2f75588322121bac367467f5be43a74b6"} Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.486828 5008 generic.go:334] "Generic (PLEG): container finished" podID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" containerID="b1fe00ee92bf5415e3aa397735629332885b143f8ea000da5768f6d5cd379391" exitCode=0 Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.486909 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bmh8" event={"ID":"d9fa9d31-a966-44fc-a326-66ed75f7d7bc","Type":"ContainerDied","Data":"b1fe00ee92bf5415e3aa397735629332885b143f8ea000da5768f6d5cd379391"} Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.486943 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bmh8" event={"ID":"d9fa9d31-a966-44fc-a326-66ed75f7d7bc","Type":"ContainerStarted","Data":"4a62c8e978e0f88ecd969d80f1dd20cd2532c82d3d6c31718ef6486921e17402"} Mar 18 18:26:15 crc kubenswrapper[5008]: I0318 18:26:15.494178 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cc95l" event={"ID":"2716be49-c7ba-47b8-8c63-4e5d9bd7156c","Type":"ContainerStarted","Data":"a3584a34b5206620ba8e5b62fc93248d562cece69bc3ebc04e4726acc40ef977"} Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.239978 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.315660 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqmph\" (UniqueName: \"kubernetes.io/projected/719c54e1-6e5f-4769-94b0-c3f329fcf966-kube-api-access-tqmph\") pod \"719c54e1-6e5f-4769-94b0-c3f329fcf966\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.315787 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-scripts\") pod \"719c54e1-6e5f-4769-94b0-c3f329fcf966\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.315840 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/719c54e1-6e5f-4769-94b0-c3f329fcf966-run-httpd\") pod \"719c54e1-6e5f-4769-94b0-c3f329fcf966\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.315885 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-config-data\") pod \"719c54e1-6e5f-4769-94b0-c3f329fcf966\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.315900 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-sg-core-conf-yaml\") pod \"719c54e1-6e5f-4769-94b0-c3f329fcf966\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.315929 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-combined-ca-bundle\") pod \"719c54e1-6e5f-4769-94b0-c3f329fcf966\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.316777 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/719c54e1-6e5f-4769-94b0-c3f329fcf966-log-httpd\") pod \"719c54e1-6e5f-4769-94b0-c3f329fcf966\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.316846 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-ceilometer-tls-certs\") pod \"719c54e1-6e5f-4769-94b0-c3f329fcf966\" (UID: \"719c54e1-6e5f-4769-94b0-c3f329fcf966\") " Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.318072 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/719c54e1-6e5f-4769-94b0-c3f329fcf966-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "719c54e1-6e5f-4769-94b0-c3f329fcf966" (UID: "719c54e1-6e5f-4769-94b0-c3f329fcf966"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.318362 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/719c54e1-6e5f-4769-94b0-c3f329fcf966-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "719c54e1-6e5f-4769-94b0-c3f329fcf966" (UID: "719c54e1-6e5f-4769-94b0-c3f329fcf966"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.341737 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719c54e1-6e5f-4769-94b0-c3f329fcf966-kube-api-access-tqmph" (OuterVolumeSpecName: "kube-api-access-tqmph") pod "719c54e1-6e5f-4769-94b0-c3f329fcf966" (UID: "719c54e1-6e5f-4769-94b0-c3f329fcf966"). InnerVolumeSpecName "kube-api-access-tqmph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.345670 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-scripts" (OuterVolumeSpecName: "scripts") pod "719c54e1-6e5f-4769-94b0-c3f329fcf966" (UID: "719c54e1-6e5f-4769-94b0-c3f329fcf966"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.399227 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "719c54e1-6e5f-4769-94b0-c3f329fcf966" (UID: "719c54e1-6e5f-4769-94b0-c3f329fcf966"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.418788 5008 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.419061 5008 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/719c54e1-6e5f-4769-94b0-c3f329fcf966-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.419070 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqmph\" (UniqueName: \"kubernetes.io/projected/719c54e1-6e5f-4769-94b0-c3f329fcf966-kube-api-access-tqmph\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.419080 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.419088 5008 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/719c54e1-6e5f-4769-94b0-c3f329fcf966-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.434488 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "719c54e1-6e5f-4769-94b0-c3f329fcf966" (UID: "719c54e1-6e5f-4769-94b0-c3f329fcf966"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.447470 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "719c54e1-6e5f-4769-94b0-c3f329fcf966" (UID: "719c54e1-6e5f-4769-94b0-c3f329fcf966"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.509744 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-config-data" (OuterVolumeSpecName: "config-data") pod "719c54e1-6e5f-4769-94b0-c3f329fcf966" (UID: "719c54e1-6e5f-4769-94b0-c3f329fcf966"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.517740 5008 generic.go:334] "Generic (PLEG): container finished" podID="2716be49-c7ba-47b8-8c63-4e5d9bd7156c" containerID="a3584a34b5206620ba8e5b62fc93248d562cece69bc3ebc04e4726acc40ef977" exitCode=0 Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.517806 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cc95l" event={"ID":"2716be49-c7ba-47b8-8c63-4e5d9bd7156c","Type":"ContainerDied","Data":"a3584a34b5206620ba8e5b62fc93248d562cece69bc3ebc04e4726acc40ef977"} Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.520213 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.520235 5008 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.520250 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719c54e1-6e5f-4769-94b0-c3f329fcf966-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.525269 5008 generic.go:334] "Generic (PLEG): container finished" podID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerID="c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676" exitCode=0 Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.525311 5008 generic.go:334] "Generic (PLEG): container finished" podID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerID="05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7" exitCode=0 Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.525332 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"719c54e1-6e5f-4769-94b0-c3f329fcf966","Type":"ContainerDied","Data":"c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676"} Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.525356 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"719c54e1-6e5f-4769-94b0-c3f329fcf966","Type":"ContainerDied","Data":"05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7"} Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.525406 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"719c54e1-6e5f-4769-94b0-c3f329fcf966","Type":"ContainerDied","Data":"7ea9ee2a8ab3901a9fb2bfa9c46c452c9801b18a71c52e8756907367866229d7"} Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.525422 5008 scope.go:117] "RemoveContainer" containerID="c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37" Mar 18 18:26:16 crc kubenswrapper[5008]: 
I0318 18:26:16.525601 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.612254 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.613858 5008 scope.go:117] "RemoveContainer" containerID="bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.637303 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.639682 5008 scope.go:117] "RemoveContainer" containerID="c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.647445 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:26:16 crc kubenswrapper[5008]: E0318 18:26:16.647868 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="proxy-httpd" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.647885 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="proxy-httpd" Mar 18 18:26:16 crc kubenswrapper[5008]: E0318 18:26:16.647906 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="ceilometer-central-agent" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.647913 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="ceilometer-central-agent" Mar 18 18:26:16 crc kubenswrapper[5008]: E0318 18:26:16.647925 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="sg-core" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.647931 
5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="sg-core" Mar 18 18:26:16 crc kubenswrapper[5008]: E0318 18:26:16.647950 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="ceilometer-notification-agent" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.647957 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="ceilometer-notification-agent" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.648134 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="ceilometer-central-agent" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.648153 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="proxy-httpd" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.648168 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="sg-core" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.648180 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" containerName="ceilometer-notification-agent" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.653789 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.664210 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.664446 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.668492 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.684750 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.692210 5008 scope.go:117] "RemoveContainer" containerID="05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.723226 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.723314 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/120d7db6-88e7-4edf-83bf-ba1c59004db2-run-httpd\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.723374 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/120d7db6-88e7-4edf-83bf-ba1c59004db2-log-httpd\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " 
pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.723419 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-scripts\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.723493 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c964m\" (UniqueName: \"kubernetes.io/projected/120d7db6-88e7-4edf-83bf-ba1c59004db2-kube-api-access-c964m\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.723567 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.723595 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-config-data\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.723666 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.729307 5008 scope.go:117] 
"RemoveContainer" containerID="c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37" Mar 18 18:26:16 crc kubenswrapper[5008]: E0318 18:26:16.730250 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37\": container with ID starting with c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37 not found: ID does not exist" containerID="c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.730358 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37"} err="failed to get container status \"c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37\": rpc error: code = NotFound desc = could not find container \"c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37\": container with ID starting with c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37 not found: ID does not exist" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.730388 5008 scope.go:117] "RemoveContainer" containerID="bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583" Mar 18 18:26:16 crc kubenswrapper[5008]: E0318 18:26:16.730721 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583\": container with ID starting with bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583 not found: ID does not exist" containerID="bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.730754 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583"} err="failed to get container status \"bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583\": rpc error: code = NotFound desc = could not find container \"bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583\": container with ID starting with bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583 not found: ID does not exist" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.730782 5008 scope.go:117] "RemoveContainer" containerID="c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676" Mar 18 18:26:16 crc kubenswrapper[5008]: E0318 18:26:16.731254 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676\": container with ID starting with c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676 not found: ID does not exist" containerID="c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.731279 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676"} err="failed to get container status \"c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676\": rpc error: code = NotFound desc = could not find container \"c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676\": container with ID starting with c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676 not found: ID does not exist" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.731293 5008 scope.go:117] "RemoveContainer" containerID="05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7" Mar 18 18:26:16 crc kubenswrapper[5008]: E0318 18:26:16.731599 5008 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7\": container with ID starting with 05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7 not found: ID does not exist" containerID="05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.731623 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7"} err="failed to get container status \"05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7\": rpc error: code = NotFound desc = could not find container \"05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7\": container with ID starting with 05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7 not found: ID does not exist" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.731641 5008 scope.go:117] "RemoveContainer" containerID="c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.731910 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37"} err="failed to get container status \"c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37\": rpc error: code = NotFound desc = could not find container \"c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37\": container with ID starting with c6fd87988390e3e53afa3b8fdfdc58229df833db2b97b63044fa3c5479ac7a37 not found: ID does not exist" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.731932 5008 scope.go:117] "RemoveContainer" containerID="bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.732188 5008 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583"} err="failed to get container status \"bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583\": rpc error: code = NotFound desc = could not find container \"bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583\": container with ID starting with bb996ae35a87a77f116d672cba988a2bb62a2d44dcfb81cf8080b4c81757d583 not found: ID does not exist" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.732217 5008 scope.go:117] "RemoveContainer" containerID="c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.732481 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676"} err="failed to get container status \"c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676\": rpc error: code = NotFound desc = could not find container \"c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676\": container with ID starting with c60cf5a4b5830bf0294be2b25ddb5c7f26242f5aa48abc1abedc1a1615b3b676 not found: ID does not exist" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.732502 5008 scope.go:117] "RemoveContainer" containerID="05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.732735 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7"} err="failed to get container status \"05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7\": rpc error: code = NotFound desc = could not find container \"05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7\": container with ID starting with 
05c86c0f576235c4f8d7440ee7e61a350106077ad4cff436b7fbc4a123ebf6d7 not found: ID does not exist" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.825797 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c964m\" (UniqueName: \"kubernetes.io/projected/120d7db6-88e7-4edf-83bf-ba1c59004db2-kube-api-access-c964m\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.826175 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.826203 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-config-data\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.826245 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.826320 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.826896 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/120d7db6-88e7-4edf-83bf-ba1c59004db2-run-httpd\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.832898 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.826362 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/120d7db6-88e7-4edf-83bf-ba1c59004db2-run-httpd\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.835934 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/120d7db6-88e7-4edf-83bf-ba1c59004db2-log-httpd\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.835968 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-scripts\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.837392 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-config-data\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc 
kubenswrapper[5008]: I0318 18:26:16.837954 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/120d7db6-88e7-4edf-83bf-ba1c59004db2-log-httpd\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.840078 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.841520 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-scripts\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.842782 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.845113 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c964m\" (UniqueName: \"kubernetes.io/projected/120d7db6-88e7-4edf-83bf-ba1c59004db2-kube-api-access-c964m\") pod \"ceilometer-0\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " pod="openstack/ceilometer-0" Mar 18 18:26:16 crc kubenswrapper[5008]: I0318 18:26:16.990465 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:26:17 crc kubenswrapper[5008]: I0318 18:26:17.467694 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:26:17 crc kubenswrapper[5008]: I0318 18:26:17.539277 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"120d7db6-88e7-4edf-83bf-ba1c59004db2","Type":"ContainerStarted","Data":"6f7c7e521656e87e8d75593ee7067d79013bde958c97425cdd1ed4d205d765a0"} Mar 18 18:26:17 crc kubenswrapper[5008]: I0318 18:26:17.541294 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bmh8" event={"ID":"d9fa9d31-a966-44fc-a326-66ed75f7d7bc","Type":"ContainerStarted","Data":"3d590dc2419535af09f4ad93769ea53d4eb22eb4d17c67bc594495e674d2ce9d"} Mar 18 18:26:17 crc kubenswrapper[5008]: I0318 18:26:17.545213 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cc95l" event={"ID":"2716be49-c7ba-47b8-8c63-4e5d9bd7156c","Type":"ContainerStarted","Data":"3b6dc5b676228c5ec17a4029fd355b42c61eb3977c4be42f5a8a859b15279d95"} Mar 18 18:26:17 crc kubenswrapper[5008]: I0318 18:26:17.585347 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cc95l" podStartSLOduration=4.057171065 podStartE2EDuration="6.585328521s" podCreationTimestamp="2026-03-18 18:26:11 +0000 UTC" firstStartedPulling="2026-03-18 18:26:14.453708723 +0000 UTC m=+1430.973181802" lastFinishedPulling="2026-03-18 18:26:16.981866169 +0000 UTC m=+1433.501339258" observedRunningTime="2026-03-18 18:26:17.581944941 +0000 UTC m=+1434.101418020" watchObservedRunningTime="2026-03-18 18:26:17.585328521 +0000 UTC m=+1434.104801590" Mar 18 18:26:17 crc kubenswrapper[5008]: I0318 18:26:17.657410 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.203029 5008 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.211001 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719c54e1-6e5f-4769-94b0-c3f329fcf966" path="/var/lib/kubelet/pods/719c54e1-6e5f-4769-94b0-c3f329fcf966/volumes" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.296183 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdf2f634-339e-4f92-8046-2786b3e31caf-logs\") pod \"cdf2f634-339e-4f92-8046-2786b3e31caf\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.296372 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhxtb\" (UniqueName: \"kubernetes.io/projected/cdf2f634-339e-4f92-8046-2786b3e31caf-kube-api-access-hhxtb\") pod \"cdf2f634-339e-4f92-8046-2786b3e31caf\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.296472 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf2f634-339e-4f92-8046-2786b3e31caf-config-data\") pod \"cdf2f634-339e-4f92-8046-2786b3e31caf\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.296498 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf2f634-339e-4f92-8046-2786b3e31caf-combined-ca-bundle\") pod \"cdf2f634-339e-4f92-8046-2786b3e31caf\" (UID: \"cdf2f634-339e-4f92-8046-2786b3e31caf\") " Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.296864 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdf2f634-339e-4f92-8046-2786b3e31caf-logs" (OuterVolumeSpecName: "logs") 
pod "cdf2f634-339e-4f92-8046-2786b3e31caf" (UID: "cdf2f634-339e-4f92-8046-2786b3e31caf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.297120 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdf2f634-339e-4f92-8046-2786b3e31caf-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.311005 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf2f634-339e-4f92-8046-2786b3e31caf-kube-api-access-hhxtb" (OuterVolumeSpecName: "kube-api-access-hhxtb") pod "cdf2f634-339e-4f92-8046-2786b3e31caf" (UID: "cdf2f634-339e-4f92-8046-2786b3e31caf"). InnerVolumeSpecName "kube-api-access-hhxtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.335721 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf2f634-339e-4f92-8046-2786b3e31caf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdf2f634-339e-4f92-8046-2786b3e31caf" (UID: "cdf2f634-339e-4f92-8046-2786b3e31caf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.345102 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf2f634-339e-4f92-8046-2786b3e31caf-config-data" (OuterVolumeSpecName: "config-data") pod "cdf2f634-339e-4f92-8046-2786b3e31caf" (UID: "cdf2f634-339e-4f92-8046-2786b3e31caf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.399044 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf2f634-339e-4f92-8046-2786b3e31caf-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.399072 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf2f634-339e-4f92-8046-2786b3e31caf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.399084 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhxtb\" (UniqueName: \"kubernetes.io/projected/cdf2f634-339e-4f92-8046-2786b3e31caf-kube-api-access-hhxtb\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.556778 5008 generic.go:334] "Generic (PLEG): container finished" podID="cdf2f634-339e-4f92-8046-2786b3e31caf" containerID="f634b616c95f98c0c1116dd654fd4d78c457ad97057c7814e24970411173efa2" exitCode=0 Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.556831 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.556876 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cdf2f634-339e-4f92-8046-2786b3e31caf","Type":"ContainerDied","Data":"f634b616c95f98c0c1116dd654fd4d78c457ad97057c7814e24970411173efa2"} Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.556920 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cdf2f634-339e-4f92-8046-2786b3e31caf","Type":"ContainerDied","Data":"dce5a43499461ffacf95b2032d1d6b0b8034a9a1429756ef2791d656d2fcd38e"} Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.556942 5008 scope.go:117] "RemoveContainer" containerID="f634b616c95f98c0c1116dd654fd4d78c457ad97057c7814e24970411173efa2" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.559486 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"120d7db6-88e7-4edf-83bf-ba1c59004db2","Type":"ContainerStarted","Data":"873a27ccb12b2c96aa5aa471d0986768fe52096ef680f141f150859c3abc8386"} Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.605655 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.608292 5008 scope.go:117] "RemoveContainer" containerID="a8446aee849d17ed13bd2bcdaf2d2dd2f75588322121bac367467f5be43a74b6" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.614566 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.632481 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 18:26:18 crc kubenswrapper[5008]: E0318 18:26:18.634369 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf2f634-339e-4f92-8046-2786b3e31caf" containerName="nova-api-api" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.634463 
5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf2f634-339e-4f92-8046-2786b3e31caf" containerName="nova-api-api" Mar 18 18:26:18 crc kubenswrapper[5008]: E0318 18:26:18.634522 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf2f634-339e-4f92-8046-2786b3e31caf" containerName="nova-api-log" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.634615 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf2f634-339e-4f92-8046-2786b3e31caf" containerName="nova-api-log" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.634842 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf2f634-339e-4f92-8046-2786b3e31caf" containerName="nova-api-api" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.634906 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf2f634-339e-4f92-8046-2786b3e31caf" containerName="nova-api-log" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.635187 5008 scope.go:117] "RemoveContainer" containerID="f634b616c95f98c0c1116dd654fd4d78c457ad97057c7814e24970411173efa2" Mar 18 18:26:18 crc kubenswrapper[5008]: E0318 18:26:18.636775 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f634b616c95f98c0c1116dd654fd4d78c457ad97057c7814e24970411173efa2\": container with ID starting with f634b616c95f98c0c1116dd654fd4d78c457ad97057c7814e24970411173efa2 not found: ID does not exist" containerID="f634b616c95f98c0c1116dd654fd4d78c457ad97057c7814e24970411173efa2" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.636901 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f634b616c95f98c0c1116dd654fd4d78c457ad97057c7814e24970411173efa2"} err="failed to get container status \"f634b616c95f98c0c1116dd654fd4d78c457ad97057c7814e24970411173efa2\": rpc error: code = NotFound desc = could not find container 
\"f634b616c95f98c0c1116dd654fd4d78c457ad97057c7814e24970411173efa2\": container with ID starting with f634b616c95f98c0c1116dd654fd4d78c457ad97057c7814e24970411173efa2 not found: ID does not exist" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.636980 5008 scope.go:117] "RemoveContainer" containerID="a8446aee849d17ed13bd2bcdaf2d2dd2f75588322121bac367467f5be43a74b6" Mar 18 18:26:18 crc kubenswrapper[5008]: E0318 18:26:18.637392 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8446aee849d17ed13bd2bcdaf2d2dd2f75588322121bac367467f5be43a74b6\": container with ID starting with a8446aee849d17ed13bd2bcdaf2d2dd2f75588322121bac367467f5be43a74b6 not found: ID does not exist" containerID="a8446aee849d17ed13bd2bcdaf2d2dd2f75588322121bac367467f5be43a74b6" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.637425 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8446aee849d17ed13bd2bcdaf2d2dd2f75588322121bac367467f5be43a74b6"} err="failed to get container status \"a8446aee849d17ed13bd2bcdaf2d2dd2f75588322121bac367467f5be43a74b6\": rpc error: code = NotFound desc = could not find container \"a8446aee849d17ed13bd2bcdaf2d2dd2f75588322121bac367467f5be43a74b6\": container with ID starting with a8446aee849d17ed13bd2bcdaf2d2dd2f75588322121bac367467f5be43a74b6 not found: ID does not exist" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.637644 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.644536 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.645233 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.645477 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.647771 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.808672 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.809042 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7z2m\" (UniqueName: \"kubernetes.io/projected/b09e1ec6-0919-4efa-bd93-350edd83b918-kube-api-access-w7z2m\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.809103 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-config-data\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.809138 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/b09e1ec6-0919-4efa-bd93-350edd83b918-logs\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.809165 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-public-tls-certs\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.809223 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.910864 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.911114 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7z2m\" (UniqueName: \"kubernetes.io/projected/b09e1ec6-0919-4efa-bd93-350edd83b918-kube-api-access-w7z2m\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.911240 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-config-data\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" 
Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.911311 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b09e1ec6-0919-4efa-bd93-350edd83b918-logs\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.911394 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-public-tls-certs\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.911485 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.913015 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b09e1ec6-0919-4efa-bd93-350edd83b918-logs\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.916946 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.917524 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.919287 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-public-tls-certs\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.925337 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-config-data\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.930731 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7z2m\" (UniqueName: \"kubernetes.io/projected/b09e1ec6-0919-4efa-bd93-350edd83b918-kube-api-access-w7z2m\") pod \"nova-api-0\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") " pod="openstack/nova-api-0" Mar 18 18:26:18 crc kubenswrapper[5008]: I0318 18:26:18.954286 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 18:26:19 crc kubenswrapper[5008]: I0318 18:26:19.111364 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:19 crc kubenswrapper[5008]: I0318 18:26:19.182602 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:19 crc kubenswrapper[5008]: W0318 18:26:19.515405 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb09e1ec6_0919_4efa_bd93_350edd83b918.slice/crio-9a6f9f8046d52867f4a1dacf4a6b9589f5a190b18a0c0692654bfe585418a2cd WatchSource:0}: Error finding container 9a6f9f8046d52867f4a1dacf4a6b9589f5a190b18a0c0692654bfe585418a2cd: Status 404 returned error can't find the container with id 9a6f9f8046d52867f4a1dacf4a6b9589f5a190b18a0c0692654bfe585418a2cd Mar 18 18:26:19 crc kubenswrapper[5008]: I0318 18:26:19.525833 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:26:19 crc kubenswrapper[5008]: I0318 18:26:19.600317 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b09e1ec6-0919-4efa-bd93-350edd83b918","Type":"ContainerStarted","Data":"9a6f9f8046d52867f4a1dacf4a6b9589f5a190b18a0c0692654bfe585418a2cd"} Mar 18 18:26:19 crc kubenswrapper[5008]: I0318 18:26:19.652913 5008 generic.go:334] "Generic (PLEG): container finished" podID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" containerID="3d590dc2419535af09f4ad93769ea53d4eb22eb4d17c67bc594495e674d2ce9d" exitCode=0 Mar 18 18:26:19 crc kubenswrapper[5008]: I0318 18:26:19.653042 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bmh8" event={"ID":"d9fa9d31-a966-44fc-a326-66ed75f7d7bc","Type":"ContainerDied","Data":"3d590dc2419535af09f4ad93769ea53d4eb22eb4d17c67bc594495e674d2ce9d"} Mar 18 18:26:19 crc 
kubenswrapper[5008]: I0318 18:26:19.680123 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"120d7db6-88e7-4edf-83bf-ba1c59004db2","Type":"ContainerStarted","Data":"05d0b655b1ab89a79198e4909028e56e055ac68df32d81640b553e8b8e2c5ea2"} Mar 18 18:26:19 crc kubenswrapper[5008]: I0318 18:26:19.706918 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:26:19 crc kubenswrapper[5008]: I0318 18:26:19.949074 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-g9x6h"] Mar 18 18:26:19 crc kubenswrapper[5008]: I0318 18:26:19.950293 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:19 crc kubenswrapper[5008]: I0318 18:26:19.956792 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 18:26:19 crc kubenswrapper[5008]: I0318 18:26:19.957042 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 18:26:19 crc kubenswrapper[5008]: I0318 18:26:19.962552 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-g9x6h"] Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.150756 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbpx8\" (UniqueName: \"kubernetes.io/projected/bf33fcec-6589-4b9b-8271-7f51af7ae085-kube-api-access-lbpx8\") pod \"nova-cell1-cell-mapping-g9x6h\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.150870 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-scripts\") pod 
\"nova-cell1-cell-mapping-g9x6h\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.150945 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-config-data\") pod \"nova-cell1-cell-mapping-g9x6h\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.150967 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-g9x6h\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.218895 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf2f634-339e-4f92-8046-2786b3e31caf" path="/var/lib/kubelet/pods/cdf2f634-339e-4f92-8046-2786b3e31caf/volumes" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.253346 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-scripts\") pod \"nova-cell1-cell-mapping-g9x6h\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.253596 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-config-data\") pod \"nova-cell1-cell-mapping-g9x6h\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 
18:26:20.253636 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-g9x6h\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.253668 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbpx8\" (UniqueName: \"kubernetes.io/projected/bf33fcec-6589-4b9b-8271-7f51af7ae085-kube-api-access-lbpx8\") pod \"nova-cell1-cell-mapping-g9x6h\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.271168 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-config-data\") pod \"nova-cell1-cell-mapping-g9x6h\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.271594 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-scripts\") pod \"nova-cell1-cell-mapping-g9x6h\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.274507 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-g9x6h\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.277903 5008 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lbpx8\" (UniqueName: \"kubernetes.io/projected/bf33fcec-6589-4b9b-8271-7f51af7ae085-kube-api-access-lbpx8\") pod \"nova-cell1-cell-mapping-g9x6h\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.279913 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.689399 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bmh8" event={"ID":"d9fa9d31-a966-44fc-a326-66ed75f7d7bc","Type":"ContainerStarted","Data":"7cae9446f342ae7835cb2dea80dde05cc0764e5833362e5f1bed967c1766384b"} Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.691745 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"120d7db6-88e7-4edf-83bf-ba1c59004db2","Type":"ContainerStarted","Data":"be70a582dd90f1296fa65d1eb77e1d949181b975fcb8492f31fade4d248da6eb"} Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.695333 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b09e1ec6-0919-4efa-bd93-350edd83b918","Type":"ContainerStarted","Data":"1a104a8415c8a7628d605ed30bfd389a15babac32480a323563573fe1d4dbc82"} Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.695361 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b09e1ec6-0919-4efa-bd93-350edd83b918","Type":"ContainerStarted","Data":"7d776ea979c029cd993818fea176393bb6b60d1801ccb1f9b619b54c5338d016"} Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.725134 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.725113447 podStartE2EDuration="2.725113447s" podCreationTimestamp="2026-03-18 18:26:18 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:26:20.712478712 +0000 UTC m=+1437.231951791" watchObservedRunningTime="2026-03-18 18:26:20.725113447 +0000 UTC m=+1437.244586526" Mar 18 18:26:20 crc kubenswrapper[5008]: I0318 18:26:20.779898 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-g9x6h"] Mar 18 18:26:21 crc kubenswrapper[5008]: I0318 18:26:21.702311 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-g9x6h" event={"ID":"bf33fcec-6589-4b9b-8271-7f51af7ae085","Type":"ContainerStarted","Data":"636a655d1d4b129edd75d71f58c73e9276309075062203288ef5db3fbb347e6f"} Mar 18 18:26:21 crc kubenswrapper[5008]: I0318 18:26:21.702823 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-g9x6h" event={"ID":"bf33fcec-6589-4b9b-8271-7f51af7ae085","Type":"ContainerStarted","Data":"2341d07dfcabd3bc95af350a11a41b9a6bb0f09abff3dbeb443dafc852152844"} Mar 18 18:26:21 crc kubenswrapper[5008]: I0318 18:26:21.717191 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-g9x6h" podStartSLOduration=2.717171993 podStartE2EDuration="2.717171993s" podCreationTimestamp="2026-03-18 18:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:26:21.714600175 +0000 UTC m=+1438.234073254" watchObservedRunningTime="2026-03-18 18:26:21.717171993 +0000 UTC m=+1438.236645062" Mar 18 18:26:21 crc kubenswrapper[5008]: I0318 18:26:21.736852 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5bmh8" podStartSLOduration=3.083666 podStartE2EDuration="7.736826884s" podCreationTimestamp="2026-03-18 18:26:14 +0000 UTC" firstStartedPulling="2026-03-18 18:26:15.489896029 +0000 UTC m=+1432.009369098" 
lastFinishedPulling="2026-03-18 18:26:20.143056893 +0000 UTC m=+1436.662529982" observedRunningTime="2026-03-18 18:26:21.732404877 +0000 UTC m=+1438.251877956" watchObservedRunningTime="2026-03-18 18:26:21.736826884 +0000 UTC m=+1438.256299963" Mar 18 18:26:21 crc kubenswrapper[5008]: I0318 18:26:21.917709 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.005121 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b495b9cc7-9rfgw"] Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.005334 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw" podUID="0e0ed916-3672-43f4-8045-ceab250a8a6f" containerName="dnsmasq-dns" containerID="cri-o://f25a87e2c6d87129a84288945fd7a42bfe68c586d6ccb3f6767f3d4e3c2246a1" gracePeriod=10 Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.251697 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.252008 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.299842 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.537055 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.710377 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-ovsdbserver-nb\") pod \"0e0ed916-3672-43f4-8045-ceab250a8a6f\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.710470 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-ovsdbserver-sb\") pod \"0e0ed916-3672-43f4-8045-ceab250a8a6f\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.710595 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-dns-swift-storage-0\") pod \"0e0ed916-3672-43f4-8045-ceab250a8a6f\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.710735 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdvfc\" (UniqueName: \"kubernetes.io/projected/0e0ed916-3672-43f4-8045-ceab250a8a6f-kube-api-access-zdvfc\") pod \"0e0ed916-3672-43f4-8045-ceab250a8a6f\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.710770 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-config\") pod \"0e0ed916-3672-43f4-8045-ceab250a8a6f\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.710811 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-dns-svc\") pod \"0e0ed916-3672-43f4-8045-ceab250a8a6f\" (UID: \"0e0ed916-3672-43f4-8045-ceab250a8a6f\") " Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.714181 5008 generic.go:334] "Generic (PLEG): container finished" podID="0e0ed916-3672-43f4-8045-ceab250a8a6f" containerID="f25a87e2c6d87129a84288945fd7a42bfe68c586d6ccb3f6767f3d4e3c2246a1" exitCode=0 Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.714224 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw" event={"ID":"0e0ed916-3672-43f4-8045-ceab250a8a6f","Type":"ContainerDied","Data":"f25a87e2c6d87129a84288945fd7a42bfe68c586d6ccb3f6767f3d4e3c2246a1"} Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.714267 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.714287 5008 scope.go:117] "RemoveContainer" containerID="f25a87e2c6d87129a84288945fd7a42bfe68c586d6ccb3f6767f3d4e3c2246a1" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.714274 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b495b9cc7-9rfgw" event={"ID":"0e0ed916-3672-43f4-8045-ceab250a8a6f","Type":"ContainerDied","Data":"a8f0353e965c25ecb1a59228a63208b1a904e8ef21b602771bce7b95c0d17b20"} Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.727506 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0ed916-3672-43f4-8045-ceab250a8a6f-kube-api-access-zdvfc" (OuterVolumeSpecName: "kube-api-access-zdvfc") pod "0e0ed916-3672-43f4-8045-ceab250a8a6f" (UID: "0e0ed916-3672-43f4-8045-ceab250a8a6f"). InnerVolumeSpecName "kube-api-access-zdvfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.800354 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0e0ed916-3672-43f4-8045-ceab250a8a6f" (UID: "0e0ed916-3672-43f4-8045-ceab250a8a6f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.810791 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.810985 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0e0ed916-3672-43f4-8045-ceab250a8a6f" (UID: "0e0ed916-3672-43f4-8045-ceab250a8a6f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.813225 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0e0ed916-3672-43f4-8045-ceab250a8a6f" (UID: "0e0ed916-3672-43f4-8045-ceab250a8a6f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.813767 5008 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.813784 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdvfc\" (UniqueName: \"kubernetes.io/projected/0e0ed916-3672-43f4-8045-ceab250a8a6f-kube-api-access-zdvfc\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.813796 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.813805 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.823218 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0e0ed916-3672-43f4-8045-ceab250a8a6f" (UID: "0e0ed916-3672-43f4-8045-ceab250a8a6f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.829431 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-config" (OuterVolumeSpecName: "config") pod "0e0ed916-3672-43f4-8045-ceab250a8a6f" (UID: "0e0ed916-3672-43f4-8045-ceab250a8a6f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.834254 5008 scope.go:117] "RemoveContainer" containerID="8bb1cebd17e7d2450f86b491860fab53d3e61037e5d2808b683bc3805fd6237d" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.876202 5008 scope.go:117] "RemoveContainer" containerID="f25a87e2c6d87129a84288945fd7a42bfe68c586d6ccb3f6767f3d4e3c2246a1" Mar 18 18:26:22 crc kubenswrapper[5008]: E0318 18:26:22.878691 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f25a87e2c6d87129a84288945fd7a42bfe68c586d6ccb3f6767f3d4e3c2246a1\": container with ID starting with f25a87e2c6d87129a84288945fd7a42bfe68c586d6ccb3f6767f3d4e3c2246a1 not found: ID does not exist" containerID="f25a87e2c6d87129a84288945fd7a42bfe68c586d6ccb3f6767f3d4e3c2246a1" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.878742 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f25a87e2c6d87129a84288945fd7a42bfe68c586d6ccb3f6767f3d4e3c2246a1"} err="failed to get container status \"f25a87e2c6d87129a84288945fd7a42bfe68c586d6ccb3f6767f3d4e3c2246a1\": rpc error: code = NotFound desc = could not find container \"f25a87e2c6d87129a84288945fd7a42bfe68c586d6ccb3f6767f3d4e3c2246a1\": container with ID starting with f25a87e2c6d87129a84288945fd7a42bfe68c586d6ccb3f6767f3d4e3c2246a1 not found: ID does not exist" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.878768 5008 scope.go:117] "RemoveContainer" containerID="8bb1cebd17e7d2450f86b491860fab53d3e61037e5d2808b683bc3805fd6237d" Mar 18 18:26:22 crc kubenswrapper[5008]: E0318 18:26:22.881298 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bb1cebd17e7d2450f86b491860fab53d3e61037e5d2808b683bc3805fd6237d\": container with ID starting with 
8bb1cebd17e7d2450f86b491860fab53d3e61037e5d2808b683bc3805fd6237d not found: ID does not exist" containerID="8bb1cebd17e7d2450f86b491860fab53d3e61037e5d2808b683bc3805fd6237d" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.881333 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb1cebd17e7d2450f86b491860fab53d3e61037e5d2808b683bc3805fd6237d"} err="failed to get container status \"8bb1cebd17e7d2450f86b491860fab53d3e61037e5d2808b683bc3805fd6237d\": rpc error: code = NotFound desc = could not find container \"8bb1cebd17e7d2450f86b491860fab53d3e61037e5d2808b683bc3805fd6237d\": container with ID starting with 8bb1cebd17e7d2450f86b491860fab53d3e61037e5d2808b683bc3805fd6237d not found: ID does not exist" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.914958 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:22 crc kubenswrapper[5008]: I0318 18:26:22.914981 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0e0ed916-3672-43f4-8045-ceab250a8a6f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:23 crc kubenswrapper[5008]: I0318 18:26:23.082171 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b495b9cc7-9rfgw"] Mar 18 18:26:23 crc kubenswrapper[5008]: I0318 18:26:23.090634 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b495b9cc7-9rfgw"] Mar 18 18:26:23 crc kubenswrapper[5008]: I0318 18:26:23.643102 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cc95l"] Mar 18 18:26:23 crc kubenswrapper[5008]: I0318 18:26:23.726668 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"120d7db6-88e7-4edf-83bf-ba1c59004db2","Type":"ContainerStarted","Data":"427efbf0b58f1a9bc4bca60102f8b2be41b9fbacdae2766e7a982dfbbf642e27"} Mar 18 18:26:23 crc kubenswrapper[5008]: I0318 18:26:23.727045 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="ceilometer-central-agent" containerID="cri-o://873a27ccb12b2c96aa5aa471d0986768fe52096ef680f141f150859c3abc8386" gracePeriod=30 Mar 18 18:26:23 crc kubenswrapper[5008]: I0318 18:26:23.727436 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="proxy-httpd" containerID="cri-o://427efbf0b58f1a9bc4bca60102f8b2be41b9fbacdae2766e7a982dfbbf642e27" gracePeriod=30 Mar 18 18:26:23 crc kubenswrapper[5008]: I0318 18:26:23.727492 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="sg-core" containerID="cri-o://be70a582dd90f1296fa65d1eb77e1d949181b975fcb8492f31fade4d248da6eb" gracePeriod=30 Mar 18 18:26:23 crc kubenswrapper[5008]: I0318 18:26:23.727525 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="ceilometer-notification-agent" containerID="cri-o://05d0b655b1ab89a79198e4909028e56e055ac68df32d81640b553e8b8e2c5ea2" gracePeriod=30 Mar 18 18:26:23 crc kubenswrapper[5008]: I0318 18:26:23.752959 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.315165733 podStartE2EDuration="7.752934873s" podCreationTimestamp="2026-03-18 18:26:16 +0000 UTC" firstStartedPulling="2026-03-18 18:26:17.46950517 +0000 UTC m=+1433.988978249" lastFinishedPulling="2026-03-18 18:26:22.90727431 +0000 UTC m=+1439.426747389" 
observedRunningTime="2026-03-18 18:26:23.751207277 +0000 UTC m=+1440.270680366" watchObservedRunningTime="2026-03-18 18:26:23.752934873 +0000 UTC m=+1440.272407992" Mar 18 18:26:24 crc kubenswrapper[5008]: I0318 18:26:24.212402 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e0ed916-3672-43f4-8045-ceab250a8a6f" path="/var/lib/kubelet/pods/0e0ed916-3672-43f4-8045-ceab250a8a6f/volumes" Mar 18 18:26:24 crc kubenswrapper[5008]: I0318 18:26:24.607387 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5bmh8" Mar 18 18:26:24 crc kubenswrapper[5008]: I0318 18:26:24.607429 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5bmh8" Mar 18 18:26:24 crc kubenswrapper[5008]: I0318 18:26:24.737961 5008 generic.go:334] "Generic (PLEG): container finished" podID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerID="427efbf0b58f1a9bc4bca60102f8b2be41b9fbacdae2766e7a982dfbbf642e27" exitCode=0 Mar 18 18:26:24 crc kubenswrapper[5008]: I0318 18:26:24.737991 5008 generic.go:334] "Generic (PLEG): container finished" podID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerID="be70a582dd90f1296fa65d1eb77e1d949181b975fcb8492f31fade4d248da6eb" exitCode=2 Mar 18 18:26:24 crc kubenswrapper[5008]: I0318 18:26:24.737999 5008 generic.go:334] "Generic (PLEG): container finished" podID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerID="05d0b655b1ab89a79198e4909028e56e055ac68df32d81640b553e8b8e2c5ea2" exitCode=0 Mar 18 18:26:24 crc kubenswrapper[5008]: I0318 18:26:24.738054 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"120d7db6-88e7-4edf-83bf-ba1c59004db2","Type":"ContainerDied","Data":"427efbf0b58f1a9bc4bca60102f8b2be41b9fbacdae2766e7a982dfbbf642e27"} Mar 18 18:26:24 crc kubenswrapper[5008]: I0318 18:26:24.738086 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"120d7db6-88e7-4edf-83bf-ba1c59004db2","Type":"ContainerDied","Data":"be70a582dd90f1296fa65d1eb77e1d949181b975fcb8492f31fade4d248da6eb"} Mar 18 18:26:24 crc kubenswrapper[5008]: I0318 18:26:24.738097 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"120d7db6-88e7-4edf-83bf-ba1c59004db2","Type":"ContainerDied","Data":"05d0b655b1ab89a79198e4909028e56e055ac68df32d81640b553e8b8e2c5ea2"} Mar 18 18:26:24 crc kubenswrapper[5008]: I0318 18:26:24.738173 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cc95l" podUID="2716be49-c7ba-47b8-8c63-4e5d9bd7156c" containerName="registry-server" containerID="cri-o://3b6dc5b676228c5ec17a4029fd355b42c61eb3977c4be42f5a8a859b15279d95" gracePeriod=2 Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.287849 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.371070 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-catalog-content\") pod \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\" (UID: \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\") " Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.371166 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-utilities\") pod \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\" (UID: \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\") " Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.371271 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rxm4\" (UniqueName: \"kubernetes.io/projected/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-kube-api-access-6rxm4\") pod 
\"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\" (UID: \"2716be49-c7ba-47b8-8c63-4e5d9bd7156c\") " Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.372020 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-utilities" (OuterVolumeSpecName: "utilities") pod "2716be49-c7ba-47b8-8c63-4e5d9bd7156c" (UID: "2716be49-c7ba-47b8-8c63-4e5d9bd7156c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.383439 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-kube-api-access-6rxm4" (OuterVolumeSpecName: "kube-api-access-6rxm4") pod "2716be49-c7ba-47b8-8c63-4e5d9bd7156c" (UID: "2716be49-c7ba-47b8-8c63-4e5d9bd7156c"). InnerVolumeSpecName "kube-api-access-6rxm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.452258 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2716be49-c7ba-47b8-8c63-4e5d9bd7156c" (UID: "2716be49-c7ba-47b8-8c63-4e5d9bd7156c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.474694 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.474728 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rxm4\" (UniqueName: \"kubernetes.io/projected/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-kube-api-access-6rxm4\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.474741 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2716be49-c7ba-47b8-8c63-4e5d9bd7156c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.657227 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5bmh8" podUID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" containerName="registry-server" probeResult="failure" output=< Mar 18 18:26:25 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s Mar 18 18:26:25 crc kubenswrapper[5008]: > Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.752928 5008 generic.go:334] "Generic (PLEG): container finished" podID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerID="873a27ccb12b2c96aa5aa471d0986768fe52096ef680f141f150859c3abc8386" exitCode=0 Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.753016 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"120d7db6-88e7-4edf-83bf-ba1c59004db2","Type":"ContainerDied","Data":"873a27ccb12b2c96aa5aa471d0986768fe52096ef680f141f150859c3abc8386"} Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.774009 5008 generic.go:334] "Generic (PLEG): container finished" podID="2716be49-c7ba-47b8-8c63-4e5d9bd7156c" 
containerID="3b6dc5b676228c5ec17a4029fd355b42c61eb3977c4be42f5a8a859b15279d95" exitCode=0 Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.774236 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cc95l" event={"ID":"2716be49-c7ba-47b8-8c63-4e5d9bd7156c","Type":"ContainerDied","Data":"3b6dc5b676228c5ec17a4029fd355b42c61eb3977c4be42f5a8a859b15279d95"} Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.774261 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cc95l" event={"ID":"2716be49-c7ba-47b8-8c63-4e5d9bd7156c","Type":"ContainerDied","Data":"61dd1ff78cded048717ebec85dbe465d58bbd5bc16910835ce193ef2b552a985"} Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.774307 5008 scope.go:117] "RemoveContainer" containerID="3b6dc5b676228c5ec17a4029fd355b42c61eb3977c4be42f5a8a859b15279d95" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.774437 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cc95l" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.809397 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cc95l"] Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.818585 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cc95l"] Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.836761 5008 scope.go:117] "RemoveContainer" containerID="a3584a34b5206620ba8e5b62fc93248d562cece69bc3ebc04e4726acc40ef977" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.869749 5008 scope.go:117] "RemoveContainer" containerID="f8ae42b0df6d40272b523b86c73c07745c5f2b4e2ef16bb25015eb98894d1158" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.916863 5008 scope.go:117] "RemoveContainer" containerID="3b6dc5b676228c5ec17a4029fd355b42c61eb3977c4be42f5a8a859b15279d95" Mar 18 18:26:25 crc kubenswrapper[5008]: E0318 18:26:25.917302 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b6dc5b676228c5ec17a4029fd355b42c61eb3977c4be42f5a8a859b15279d95\": container with ID starting with 3b6dc5b676228c5ec17a4029fd355b42c61eb3977c4be42f5a8a859b15279d95 not found: ID does not exist" containerID="3b6dc5b676228c5ec17a4029fd355b42c61eb3977c4be42f5a8a859b15279d95" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.917345 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b6dc5b676228c5ec17a4029fd355b42c61eb3977c4be42f5a8a859b15279d95"} err="failed to get container status \"3b6dc5b676228c5ec17a4029fd355b42c61eb3977c4be42f5a8a859b15279d95\": rpc error: code = NotFound desc = could not find container \"3b6dc5b676228c5ec17a4029fd355b42c61eb3977c4be42f5a8a859b15279d95\": container with ID starting with 3b6dc5b676228c5ec17a4029fd355b42c61eb3977c4be42f5a8a859b15279d95 not 
found: ID does not exist" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.917374 5008 scope.go:117] "RemoveContainer" containerID="a3584a34b5206620ba8e5b62fc93248d562cece69bc3ebc04e4726acc40ef977" Mar 18 18:26:25 crc kubenswrapper[5008]: E0318 18:26:25.917653 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3584a34b5206620ba8e5b62fc93248d562cece69bc3ebc04e4726acc40ef977\": container with ID starting with a3584a34b5206620ba8e5b62fc93248d562cece69bc3ebc04e4726acc40ef977 not found: ID does not exist" containerID="a3584a34b5206620ba8e5b62fc93248d562cece69bc3ebc04e4726acc40ef977" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.917687 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3584a34b5206620ba8e5b62fc93248d562cece69bc3ebc04e4726acc40ef977"} err="failed to get container status \"a3584a34b5206620ba8e5b62fc93248d562cece69bc3ebc04e4726acc40ef977\": rpc error: code = NotFound desc = could not find container \"a3584a34b5206620ba8e5b62fc93248d562cece69bc3ebc04e4726acc40ef977\": container with ID starting with a3584a34b5206620ba8e5b62fc93248d562cece69bc3ebc04e4726acc40ef977 not found: ID does not exist" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.917707 5008 scope.go:117] "RemoveContainer" containerID="f8ae42b0df6d40272b523b86c73c07745c5f2b4e2ef16bb25015eb98894d1158" Mar 18 18:26:25 crc kubenswrapper[5008]: E0318 18:26:25.917921 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8ae42b0df6d40272b523b86c73c07745c5f2b4e2ef16bb25015eb98894d1158\": container with ID starting with f8ae42b0df6d40272b523b86c73c07745c5f2b4e2ef16bb25015eb98894d1158 not found: ID does not exist" containerID="f8ae42b0df6d40272b523b86c73c07745c5f2b4e2ef16bb25015eb98894d1158" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.917939 5008 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ae42b0df6d40272b523b86c73c07745c5f2b4e2ef16bb25015eb98894d1158"} err="failed to get container status \"f8ae42b0df6d40272b523b86c73c07745c5f2b4e2ef16bb25015eb98894d1158\": rpc error: code = NotFound desc = could not find container \"f8ae42b0df6d40272b523b86c73c07745c5f2b4e2ef16bb25015eb98894d1158\": container with ID starting with f8ae42b0df6d40272b523b86c73c07745c5f2b4e2ef16bb25015eb98894d1158 not found: ID does not exist" Mar 18 18:26:25 crc kubenswrapper[5008]: I0318 18:26:25.945680 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.085238 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/120d7db6-88e7-4edf-83bf-ba1c59004db2-log-httpd\") pod \"120d7db6-88e7-4edf-83bf-ba1c59004db2\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.085302 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-sg-core-conf-yaml\") pod \"120d7db6-88e7-4edf-83bf-ba1c59004db2\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.085465 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-scripts\") pod \"120d7db6-88e7-4edf-83bf-ba1c59004db2\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.085518 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/120d7db6-88e7-4edf-83bf-ba1c59004db2-run-httpd\") pod 
\"120d7db6-88e7-4edf-83bf-ba1c59004db2\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.085576 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-ceilometer-tls-certs\") pod \"120d7db6-88e7-4edf-83bf-ba1c59004db2\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.085657 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c964m\" (UniqueName: \"kubernetes.io/projected/120d7db6-88e7-4edf-83bf-ba1c59004db2-kube-api-access-c964m\") pod \"120d7db6-88e7-4edf-83bf-ba1c59004db2\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.085690 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-config-data\") pod \"120d7db6-88e7-4edf-83bf-ba1c59004db2\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.085754 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-combined-ca-bundle\") pod \"120d7db6-88e7-4edf-83bf-ba1c59004db2\" (UID: \"120d7db6-88e7-4edf-83bf-ba1c59004db2\") " Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.085837 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/120d7db6-88e7-4edf-83bf-ba1c59004db2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "120d7db6-88e7-4edf-83bf-ba1c59004db2" (UID: "120d7db6-88e7-4edf-83bf-ba1c59004db2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.085945 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/120d7db6-88e7-4edf-83bf-ba1c59004db2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "120d7db6-88e7-4edf-83bf-ba1c59004db2" (UID: "120d7db6-88e7-4edf-83bf-ba1c59004db2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.086673 5008 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/120d7db6-88e7-4edf-83bf-ba1c59004db2-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.086700 5008 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/120d7db6-88e7-4edf-83bf-ba1c59004db2-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.090974 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-scripts" (OuterVolumeSpecName: "scripts") pod "120d7db6-88e7-4edf-83bf-ba1c59004db2" (UID: "120d7db6-88e7-4edf-83bf-ba1c59004db2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.091159 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/120d7db6-88e7-4edf-83bf-ba1c59004db2-kube-api-access-c964m" (OuterVolumeSpecName: "kube-api-access-c964m") pod "120d7db6-88e7-4edf-83bf-ba1c59004db2" (UID: "120d7db6-88e7-4edf-83bf-ba1c59004db2"). InnerVolumeSpecName "kube-api-access-c964m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.129136 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "120d7db6-88e7-4edf-83bf-ba1c59004db2" (UID: "120d7db6-88e7-4edf-83bf-ba1c59004db2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.139091 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "120d7db6-88e7-4edf-83bf-ba1c59004db2" (UID: "120d7db6-88e7-4edf-83bf-ba1c59004db2"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.188733 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.188772 5008 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.188787 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c964m\" (UniqueName: \"kubernetes.io/projected/120d7db6-88e7-4edf-83bf-ba1c59004db2-kube-api-access-c964m\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.188798 5008 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.202165 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-config-data" (OuterVolumeSpecName: "config-data") pod "120d7db6-88e7-4edf-83bf-ba1c59004db2" (UID: "120d7db6-88e7-4edf-83bf-ba1c59004db2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.202196 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "120d7db6-88e7-4edf-83bf-ba1c59004db2" (UID: "120d7db6-88e7-4edf-83bf-ba1c59004db2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.213473 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2716be49-c7ba-47b8-8c63-4e5d9bd7156c" path="/var/lib/kubelet/pods/2716be49-c7ba-47b8-8c63-4e5d9bd7156c/volumes" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.291069 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.291099 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/120d7db6-88e7-4edf-83bf-ba1c59004db2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.799854 5008 generic.go:334] "Generic (PLEG): container finished" podID="bf33fcec-6589-4b9b-8271-7f51af7ae085" 
containerID="636a655d1d4b129edd75d71f58c73e9276309075062203288ef5db3fbb347e6f" exitCode=0 Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.799894 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-g9x6h" event={"ID":"bf33fcec-6589-4b9b-8271-7f51af7ae085","Type":"ContainerDied","Data":"636a655d1d4b129edd75d71f58c73e9276309075062203288ef5db3fbb347e6f"} Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.804593 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"120d7db6-88e7-4edf-83bf-ba1c59004db2","Type":"ContainerDied","Data":"6f7c7e521656e87e8d75593ee7067d79013bde958c97425cdd1ed4d205d765a0"} Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.804631 5008 scope.go:117] "RemoveContainer" containerID="427efbf0b58f1a9bc4bca60102f8b2be41b9fbacdae2766e7a982dfbbf642e27" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.804735 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.833735 5008 scope.go:117] "RemoveContainer" containerID="be70a582dd90f1296fa65d1eb77e1d949181b975fcb8492f31fade4d248da6eb" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.842326 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.858136 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.869118 5008 scope.go:117] "RemoveContainer" containerID="05d0b655b1ab89a79198e4909028e56e055ac68df32d81640b553e8b8e2c5ea2" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.876996 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:26:26 crc kubenswrapper[5008]: E0318 18:26:26.877417 5008 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2716be49-c7ba-47b8-8c63-4e5d9bd7156c" containerName="extract-utilities" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877434 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="2716be49-c7ba-47b8-8c63-4e5d9bd7156c" containerName="extract-utilities" Mar 18 18:26:26 crc kubenswrapper[5008]: E0318 18:26:26.877446 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="proxy-httpd" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877452 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="proxy-httpd" Mar 18 18:26:26 crc kubenswrapper[5008]: E0318 18:26:26.877461 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="sg-core" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877468 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="sg-core" Mar 18 18:26:26 crc kubenswrapper[5008]: E0318 18:26:26.877487 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2716be49-c7ba-47b8-8c63-4e5d9bd7156c" containerName="registry-server" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877492 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="2716be49-c7ba-47b8-8c63-4e5d9bd7156c" containerName="registry-server" Mar 18 18:26:26 crc kubenswrapper[5008]: E0318 18:26:26.877505 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2716be49-c7ba-47b8-8c63-4e5d9bd7156c" containerName="extract-content" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877511 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="2716be49-c7ba-47b8-8c63-4e5d9bd7156c" containerName="extract-content" Mar 18 18:26:26 crc kubenswrapper[5008]: E0318 18:26:26.877520 5008 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="ceilometer-central-agent" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877526 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="ceilometer-central-agent" Mar 18 18:26:26 crc kubenswrapper[5008]: E0318 18:26:26.877533 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0ed916-3672-43f4-8045-ceab250a8a6f" containerName="dnsmasq-dns" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877539 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0ed916-3672-43f4-8045-ceab250a8a6f" containerName="dnsmasq-dns" Mar 18 18:26:26 crc kubenswrapper[5008]: E0318 18:26:26.877549 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="ceilometer-notification-agent" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877570 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="ceilometer-notification-agent" Mar 18 18:26:26 crc kubenswrapper[5008]: E0318 18:26:26.877585 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0ed916-3672-43f4-8045-ceab250a8a6f" containerName="init" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877593 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0ed916-3672-43f4-8045-ceab250a8a6f" containerName="init" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877794 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="2716be49-c7ba-47b8-8c63-4e5d9bd7156c" containerName="registry-server" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877807 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e0ed916-3672-43f4-8045-ceab250a8a6f" containerName="dnsmasq-dns" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877818 5008 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="ceilometer-notification-agent" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877832 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="ceilometer-central-agent" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877839 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="sg-core" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.877848 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" containerName="proxy-httpd" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.879594 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.882134 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.882356 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.882513 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.892397 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:26:26 crc kubenswrapper[5008]: I0318 18:26:26.902939 5008 scope.go:117] "RemoveContainer" containerID="873a27ccb12b2c96aa5aa471d0986768fe52096ef680f141f150859c3abc8386" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.002421 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-config-data\") pod \"ceilometer-0\" (UID: 
\"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.002485 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzh4j\" (UniqueName: \"kubernetes.io/projected/05f0e04a-507a-42ba-97ff-d91aa199b3db-kube-api-access-rzh4j\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.002527 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f0e04a-507a-42ba-97ff-d91aa199b3db-log-httpd\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.002642 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.002681 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.002738 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-scripts\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: 
I0318 18:26:27.002766 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.002790 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f0e04a-507a-42ba-97ff-d91aa199b3db-run-httpd\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.104738 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.104831 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-scripts\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.104870 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.104909 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/05f0e04a-507a-42ba-97ff-d91aa199b3db-run-httpd\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.104971 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-config-data\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.105029 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzh4j\" (UniqueName: \"kubernetes.io/projected/05f0e04a-507a-42ba-97ff-d91aa199b3db-kube-api-access-rzh4j\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.105053 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f0e04a-507a-42ba-97ff-d91aa199b3db-log-httpd\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.105082 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.112886 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f0e04a-507a-42ba-97ff-d91aa199b3db-log-httpd\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 
18:26:27.113118 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f0e04a-507a-42ba-97ff-d91aa199b3db-run-httpd\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.119906 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-scripts\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.120463 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.145837 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.146058 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-config-data\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.151214 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " 
pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.165330 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzh4j\" (UniqueName: \"kubernetes.io/projected/05f0e04a-507a-42ba-97ff-d91aa199b3db-kube-api-access-rzh4j\") pod \"ceilometer-0\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.202743 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:26:27 crc kubenswrapper[5008]: W0318 18:26:27.670376 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05f0e04a_507a_42ba_97ff_d91aa199b3db.slice/crio-87dbc1eb80f4350eb375a5fad29317eccf81cea784572c73702957ff6a703201 WatchSource:0}: Error finding container 87dbc1eb80f4350eb375a5fad29317eccf81cea784572c73702957ff6a703201: Status 404 returned error can't find the container with id 87dbc1eb80f4350eb375a5fad29317eccf81cea784572c73702957ff6a703201 Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.676445 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:26:27 crc kubenswrapper[5008]: I0318 18:26:27.817357 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05f0e04a-507a-42ba-97ff-d91aa199b3db","Type":"ContainerStarted","Data":"87dbc1eb80f4350eb375a5fad29317eccf81cea784572c73702957ff6a703201"} Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.212908 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.221548 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="120d7db6-88e7-4edf-83bf-ba1c59004db2" path="/var/lib/kubelet/pods/120d7db6-88e7-4edf-83bf-ba1c59004db2/volumes" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.331691 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-combined-ca-bundle\") pod \"bf33fcec-6589-4b9b-8271-7f51af7ae085\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.332013 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-config-data\") pod \"bf33fcec-6589-4b9b-8271-7f51af7ae085\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.332044 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbpx8\" (UniqueName: \"kubernetes.io/projected/bf33fcec-6589-4b9b-8271-7f51af7ae085-kube-api-access-lbpx8\") pod \"bf33fcec-6589-4b9b-8271-7f51af7ae085\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.333422 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-scripts\") pod \"bf33fcec-6589-4b9b-8271-7f51af7ae085\" (UID: \"bf33fcec-6589-4b9b-8271-7f51af7ae085\") " Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.336804 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf33fcec-6589-4b9b-8271-7f51af7ae085-kube-api-access-lbpx8" (OuterVolumeSpecName: 
"kube-api-access-lbpx8") pod "bf33fcec-6589-4b9b-8271-7f51af7ae085" (UID: "bf33fcec-6589-4b9b-8271-7f51af7ae085"). InnerVolumeSpecName "kube-api-access-lbpx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.338739 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-scripts" (OuterVolumeSpecName: "scripts") pod "bf33fcec-6589-4b9b-8271-7f51af7ae085" (UID: "bf33fcec-6589-4b9b-8271-7f51af7ae085"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.363035 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-config-data" (OuterVolumeSpecName: "config-data") pod "bf33fcec-6589-4b9b-8271-7f51af7ae085" (UID: "bf33fcec-6589-4b9b-8271-7f51af7ae085"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.368390 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf33fcec-6589-4b9b-8271-7f51af7ae085" (UID: "bf33fcec-6589-4b9b-8271-7f51af7ae085"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.435775 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.436102 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbpx8\" (UniqueName: \"kubernetes.io/projected/bf33fcec-6589-4b9b-8271-7f51af7ae085-kube-api-access-lbpx8\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.436113 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.436124 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf33fcec-6589-4b9b-8271-7f51af7ae085-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.833795 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-g9x6h" event={"ID":"bf33fcec-6589-4b9b-8271-7f51af7ae085","Type":"ContainerDied","Data":"2341d07dfcabd3bc95af350a11a41b9a6bb0f09abff3dbeb443dafc852152844"} Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.833847 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2341d07dfcabd3bc95af350a11a41b9a6bb0f09abff3dbeb443dafc852152844" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.833819 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-g9x6h" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.838389 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05f0e04a-507a-42ba-97ff-d91aa199b3db","Type":"ContainerStarted","Data":"c59553cd5157024b46624fb3c5ef58bbf4ed0861562e0adaaec3f71273dc5a65"} Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.954924 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.954986 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 18:26:28 crc kubenswrapper[5008]: I0318 18:26:28.993384 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:26:29 crc kubenswrapper[5008]: I0318 18:26:29.003674 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:26:29 crc kubenswrapper[5008]: I0318 18:26:29.003903 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="34863810-2f18-4480-a22c-d4a953287b50" containerName="nova-scheduler-scheduler" containerID="cri-o://59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914" gracePeriod=30 Mar 18 18:26:29 crc kubenswrapper[5008]: I0318 18:26:29.068731 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:26:29 crc kubenswrapper[5008]: I0318 18:26:29.069011 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ff482a80-2d35-4ec7-a524-67c8bbf33b5f" containerName="nova-metadata-log" containerID="cri-o://91d5e118aa7398732eea62cdb18d46fc81b2f3f3a83f501949addecd71445617" gracePeriod=30 Mar 18 18:26:29 crc kubenswrapper[5008]: I0318 18:26:29.069039 5008 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-metadata-0" podUID="ff482a80-2d35-4ec7-a524-67c8bbf33b5f" containerName="nova-metadata-metadata" containerID="cri-o://bedae4070f7dbf066a96be1c8f653bd332dd24ec8bc83fc68ab873cfa86082d0" gracePeriod=30 Mar 18 18:26:29 crc kubenswrapper[5008]: E0318 18:26:29.542606 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:26:29 crc kubenswrapper[5008]: E0318 18:26:29.544109 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:26:29 crc kubenswrapper[5008]: E0318 18:26:29.545307 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:26:29 crc kubenswrapper[5008]: E0318 18:26:29.545348 5008 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="34863810-2f18-4480-a22c-d4a953287b50" containerName="nova-scheduler-scheduler" Mar 18 18:26:29 crc kubenswrapper[5008]: I0318 18:26:29.850262 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"05f0e04a-507a-42ba-97ff-d91aa199b3db","Type":"ContainerStarted","Data":"9c0e2fb1769c01fdd6e7cf853c2f7ba8ba4a7c7185217812a0c6f433539ebd75"} Mar 18 18:26:29 crc kubenswrapper[5008]: I0318 18:26:29.852491 5008 generic.go:334] "Generic (PLEG): container finished" podID="ff482a80-2d35-4ec7-a524-67c8bbf33b5f" containerID="91d5e118aa7398732eea62cdb18d46fc81b2f3f3a83f501949addecd71445617" exitCode=143 Mar 18 18:26:29 crc kubenswrapper[5008]: I0318 18:26:29.852581 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff482a80-2d35-4ec7-a524-67c8bbf33b5f","Type":"ContainerDied","Data":"91d5e118aa7398732eea62cdb18d46fc81b2f3f3a83f501949addecd71445617"} Mar 18 18:26:29 crc kubenswrapper[5008]: I0318 18:26:29.852699 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b09e1ec6-0919-4efa-bd93-350edd83b918" containerName="nova-api-log" containerID="cri-o://7d776ea979c029cd993818fea176393bb6b60d1801ccb1f9b619b54c5338d016" gracePeriod=30 Mar 18 18:26:29 crc kubenswrapper[5008]: I0318 18:26:29.852746 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b09e1ec6-0919-4efa-bd93-350edd83b918" containerName="nova-api-api" containerID="cri-o://1a104a8415c8a7628d605ed30bfd389a15babac32480a323563573fe1d4dbc82" gracePeriod=30 Mar 18 18:26:29 crc kubenswrapper[5008]: I0318 18:26:29.858091 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b09e1ec6-0919-4efa-bd93-350edd83b918" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": EOF" Mar 18 18:26:29 crc kubenswrapper[5008]: I0318 18:26:29.858136 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b09e1ec6-0919-4efa-bd93-350edd83b918" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": EOF" Mar 18 18:26:30 crc 
kubenswrapper[5008]: I0318 18:26:30.862089 5008 generic.go:334] "Generic (PLEG): container finished" podID="b09e1ec6-0919-4efa-bd93-350edd83b918" containerID="7d776ea979c029cd993818fea176393bb6b60d1801ccb1f9b619b54c5338d016" exitCode=143 Mar 18 18:26:30 crc kubenswrapper[5008]: I0318 18:26:30.862165 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b09e1ec6-0919-4efa-bd93-350edd83b918","Type":"ContainerDied","Data":"7d776ea979c029cd993818fea176393bb6b60d1801ccb1f9b619b54c5338d016"} Mar 18 18:26:30 crc kubenswrapper[5008]: I0318 18:26:30.864956 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05f0e04a-507a-42ba-97ff-d91aa199b3db","Type":"ContainerStarted","Data":"3a5bb1f257b5ccc8acb612686fc959dffa3d2d0416f76fda835c27174ee06b01"} Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.614056 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.718782 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x4kq\" (UniqueName: \"kubernetes.io/projected/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-kube-api-access-4x4kq\") pod \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.718857 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-config-data\") pod \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.718924 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-logs\") pod 
\"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.719003 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-nova-metadata-tls-certs\") pod \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.719061 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-combined-ca-bundle\") pod \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\" (UID: \"ff482a80-2d35-4ec7-a524-67c8bbf33b5f\") " Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.722702 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-logs" (OuterVolumeSpecName: "logs") pod "ff482a80-2d35-4ec7-a524-67c8bbf33b5f" (UID: "ff482a80-2d35-4ec7-a524-67c8bbf33b5f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.724066 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-kube-api-access-4x4kq" (OuterVolumeSpecName: "kube-api-access-4x4kq") pod "ff482a80-2d35-4ec7-a524-67c8bbf33b5f" (UID: "ff482a80-2d35-4ec7-a524-67c8bbf33b5f"). InnerVolumeSpecName "kube-api-access-4x4kq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.750106 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff482a80-2d35-4ec7-a524-67c8bbf33b5f" (UID: "ff482a80-2d35-4ec7-a524-67c8bbf33b5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.767784 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-config-data" (OuterVolumeSpecName: "config-data") pod "ff482a80-2d35-4ec7-a524-67c8bbf33b5f" (UID: "ff482a80-2d35-4ec7-a524-67c8bbf33b5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.779420 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ff482a80-2d35-4ec7-a524-67c8bbf33b5f" (UID: "ff482a80-2d35-4ec7-a524-67c8bbf33b5f"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.821219 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.821255 5008 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.821266 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.821274 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x4kq\" (UniqueName: \"kubernetes.io/projected/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-kube-api-access-4x4kq\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.821282 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff482a80-2d35-4ec7-a524-67c8bbf33b5f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.891786 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05f0e04a-507a-42ba-97ff-d91aa199b3db","Type":"ContainerStarted","Data":"9af388372b24fa973623ad52fe731bca5ce89b4e94b9e0a247770b6b45d5bade"} Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.891929 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.896354 5008 generic.go:334] "Generic (PLEG): container finished" 
podID="ff482a80-2d35-4ec7-a524-67c8bbf33b5f" containerID="bedae4070f7dbf066a96be1c8f653bd332dd24ec8bc83fc68ab873cfa86082d0" exitCode=0 Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.896405 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff482a80-2d35-4ec7-a524-67c8bbf33b5f","Type":"ContainerDied","Data":"bedae4070f7dbf066a96be1c8f653bd332dd24ec8bc83fc68ab873cfa86082d0"} Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.896421 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.896445 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff482a80-2d35-4ec7-a524-67c8bbf33b5f","Type":"ContainerDied","Data":"4447c98e6b6909125a900de89d1e62293b6868faa745aabfa1b0eda5c388ab79"} Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.896463 5008 scope.go:117] "RemoveContainer" containerID="bedae4070f7dbf066a96be1c8f653bd332dd24ec8bc83fc68ab873cfa86082d0" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.921498 5008 scope.go:117] "RemoveContainer" containerID="91d5e118aa7398732eea62cdb18d46fc81b2f3f3a83f501949addecd71445617" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.935427 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.775580285 podStartE2EDuration="6.935407838s" podCreationTimestamp="2026-03-18 18:26:26 +0000 UTC" firstStartedPulling="2026-03-18 18:26:27.672993198 +0000 UTC m=+1444.192466267" lastFinishedPulling="2026-03-18 18:26:31.832820741 +0000 UTC m=+1448.352293820" observedRunningTime="2026-03-18 18:26:32.913371744 +0000 UTC m=+1449.432844823" watchObservedRunningTime="2026-03-18 18:26:32.935407838 +0000 UTC m=+1449.454880917" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.949543 5008 scope.go:117] "RemoveContainer" 
containerID="bedae4070f7dbf066a96be1c8f653bd332dd24ec8bc83fc68ab873cfa86082d0" Mar 18 18:26:32 crc kubenswrapper[5008]: E0318 18:26:32.951707 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bedae4070f7dbf066a96be1c8f653bd332dd24ec8bc83fc68ab873cfa86082d0\": container with ID starting with bedae4070f7dbf066a96be1c8f653bd332dd24ec8bc83fc68ab873cfa86082d0 not found: ID does not exist" containerID="bedae4070f7dbf066a96be1c8f653bd332dd24ec8bc83fc68ab873cfa86082d0" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.951752 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bedae4070f7dbf066a96be1c8f653bd332dd24ec8bc83fc68ab873cfa86082d0"} err="failed to get container status \"bedae4070f7dbf066a96be1c8f653bd332dd24ec8bc83fc68ab873cfa86082d0\": rpc error: code = NotFound desc = could not find container \"bedae4070f7dbf066a96be1c8f653bd332dd24ec8bc83fc68ab873cfa86082d0\": container with ID starting with bedae4070f7dbf066a96be1c8f653bd332dd24ec8bc83fc68ab873cfa86082d0 not found: ID does not exist" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.951777 5008 scope.go:117] "RemoveContainer" containerID="91d5e118aa7398732eea62cdb18d46fc81b2f3f3a83f501949addecd71445617" Mar 18 18:26:32 crc kubenswrapper[5008]: E0318 18:26:32.952182 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d5e118aa7398732eea62cdb18d46fc81b2f3f3a83f501949addecd71445617\": container with ID starting with 91d5e118aa7398732eea62cdb18d46fc81b2f3f3a83f501949addecd71445617 not found: ID does not exist" containerID="91d5e118aa7398732eea62cdb18d46fc81b2f3f3a83f501949addecd71445617" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.952234 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"91d5e118aa7398732eea62cdb18d46fc81b2f3f3a83f501949addecd71445617"} err="failed to get container status \"91d5e118aa7398732eea62cdb18d46fc81b2f3f3a83f501949addecd71445617\": rpc error: code = NotFound desc = could not find container \"91d5e118aa7398732eea62cdb18d46fc81b2f3f3a83f501949addecd71445617\": container with ID starting with 91d5e118aa7398732eea62cdb18d46fc81b2f3f3a83f501949addecd71445617 not found: ID does not exist" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.952758 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.963115 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.971265 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:26:32 crc kubenswrapper[5008]: E0318 18:26:32.971766 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff482a80-2d35-4ec7-a524-67c8bbf33b5f" containerName="nova-metadata-log" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.971787 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff482a80-2d35-4ec7-a524-67c8bbf33b5f" containerName="nova-metadata-log" Mar 18 18:26:32 crc kubenswrapper[5008]: E0318 18:26:32.971814 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff482a80-2d35-4ec7-a524-67c8bbf33b5f" containerName="nova-metadata-metadata" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.971823 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff482a80-2d35-4ec7-a524-67c8bbf33b5f" containerName="nova-metadata-metadata" Mar 18 18:26:32 crc kubenswrapper[5008]: E0318 18:26:32.971840 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf33fcec-6589-4b9b-8271-7f51af7ae085" containerName="nova-manage" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.971848 5008 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bf33fcec-6589-4b9b-8271-7f51af7ae085" containerName="nova-manage" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.972098 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf33fcec-6589-4b9b-8271-7f51af7ae085" containerName="nova-manage" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.972126 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff482a80-2d35-4ec7-a524-67c8bbf33b5f" containerName="nova-metadata-metadata" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.972147 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff482a80-2d35-4ec7-a524-67c8bbf33b5f" containerName="nova-metadata-log" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.973324 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.976936 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.977053 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 18:26:32 crc kubenswrapper[5008]: I0318 18:26:32.979371 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.024827 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6hx7\" (UniqueName: \"kubernetes.io/projected/96efea0e-17ae-49c4-8f5c-b7341def6878-kube-api-access-r6hx7\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.025437 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.025582 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.025704 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-config-data\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.025794 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96efea0e-17ae-49c4-8f5c-b7341def6878-logs\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.127912 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.128129 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.128216 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-config-data\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.128291 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96efea0e-17ae-49c4-8f5c-b7341def6878-logs\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.128361 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6hx7\" (UniqueName: \"kubernetes.io/projected/96efea0e-17ae-49c4-8f5c-b7341def6878-kube-api-access-r6hx7\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.128721 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96efea0e-17ae-49c4-8f5c-b7341def6878-logs\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.132707 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.134627 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.140604 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-config-data\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.174648 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6hx7\" (UniqueName: \"kubernetes.io/projected/96efea0e-17ae-49c4-8f5c-b7341def6878-kube-api-access-r6hx7\") pod \"nova-metadata-0\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") " pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.296460 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.680861 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.768541 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk2nj\" (UniqueName: \"kubernetes.io/projected/34863810-2f18-4480-a22c-d4a953287b50-kube-api-access-rk2nj\") pod \"34863810-2f18-4480-a22c-d4a953287b50\" (UID: \"34863810-2f18-4480-a22c-d4a953287b50\") " Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.768662 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-combined-ca-bundle\") pod \"34863810-2f18-4480-a22c-d4a953287b50\" (UID: \"34863810-2f18-4480-a22c-d4a953287b50\") " Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.768780 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-config-data\") pod \"34863810-2f18-4480-a22c-d4a953287b50\" (UID: \"34863810-2f18-4480-a22c-d4a953287b50\") " Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.775468 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34863810-2f18-4480-a22c-d4a953287b50-kube-api-access-rk2nj" (OuterVolumeSpecName: "kube-api-access-rk2nj") pod "34863810-2f18-4480-a22c-d4a953287b50" (UID: "34863810-2f18-4480-a22c-d4a953287b50"). InnerVolumeSpecName "kube-api-access-rk2nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:26:33 crc kubenswrapper[5008]: E0318 18:26:33.797147 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-combined-ca-bundle podName:34863810-2f18-4480-a22c-d4a953287b50 nodeName:}" failed. No retries permitted until 2026-03-18 18:26:34.297120108 +0000 UTC m=+1450.816593187 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-combined-ca-bundle") pod "34863810-2f18-4480-a22c-d4a953287b50" (UID: "34863810-2f18-4480-a22c-d4a953287b50") : error deleting /var/lib/kubelet/pods/34863810-2f18-4480-a22c-d4a953287b50/volume-subpaths: remove /var/lib/kubelet/pods/34863810-2f18-4480-a22c-d4a953287b50/volume-subpaths: no such file or directory Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.804130 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-config-data" (OuterVolumeSpecName: "config-data") pod "34863810-2f18-4480-a22c-d4a953287b50" (UID: "34863810-2f18-4480-a22c-d4a953287b50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.871301 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.871428 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk2nj\" (UniqueName: \"kubernetes.io/projected/34863810-2f18-4480-a22c-d4a953287b50-kube-api-access-rk2nj\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.872011 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:26:33 crc kubenswrapper[5008]: W0318 18:26:33.872614 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96efea0e_17ae_49c4_8f5c_b7341def6878.slice/crio-7540b2a740d04535fe5ca22edb79d9cf3b3ad9c77e4251988ad1a5c36720936c WatchSource:0}: Error finding container 7540b2a740d04535fe5ca22edb79d9cf3b3ad9c77e4251988ad1a5c36720936c: Status 404 
returned error can't find the container with id 7540b2a740d04535fe5ca22edb79d9cf3b3ad9c77e4251988ad1a5c36720936c Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.906369 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"96efea0e-17ae-49c4-8f5c-b7341def6878","Type":"ContainerStarted","Data":"7540b2a740d04535fe5ca22edb79d9cf3b3ad9c77e4251988ad1a5c36720936c"} Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.908241 5008 generic.go:334] "Generic (PLEG): container finished" podID="34863810-2f18-4480-a22c-d4a953287b50" containerID="59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914" exitCode=0 Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.908295 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"34863810-2f18-4480-a22c-d4a953287b50","Type":"ContainerDied","Data":"59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914"} Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.908326 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"34863810-2f18-4480-a22c-d4a953287b50","Type":"ContainerDied","Data":"96096ace69f382aba16f4e474c79515fd73616426d1728118c9a2a887a43e3f2"} Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.908281 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.908355 5008 scope.go:117] "RemoveContainer" containerID="59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.996858 5008 scope.go:117] "RemoveContainer" containerID="59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914" Mar 18 18:26:33 crc kubenswrapper[5008]: E0318 18:26:33.997289 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914\": container with ID starting with 59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914 not found: ID does not exist" containerID="59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914" Mar 18 18:26:33 crc kubenswrapper[5008]: I0318 18:26:33.997321 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914"} err="failed to get container status \"59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914\": rpc error: code = NotFound desc = could not find container \"59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914\": container with ID starting with 59c00d4d30b5ad67e84ae5b9546a05127b5e01758c72ab5fcaffc173b204d914 not found: ID does not exist" Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.255248 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff482a80-2d35-4ec7-a524-67c8bbf33b5f" path="/var/lib/kubelet/pods/ff482a80-2d35-4ec7-a524-67c8bbf33b5f/volumes" Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.393635 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-combined-ca-bundle\") pod 
\"34863810-2f18-4480-a22c-d4a953287b50\" (UID: \"34863810-2f18-4480-a22c-d4a953287b50\") " Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.396629 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34863810-2f18-4480-a22c-d4a953287b50" (UID: "34863810-2f18-4480-a22c-d4a953287b50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.495825 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34863810-2f18-4480-a22c-d4a953287b50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.541349 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.552477 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.566989 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:26:34 crc kubenswrapper[5008]: E0318 18:26:34.568029 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34863810-2f18-4480-a22c-d4a953287b50" containerName="nova-scheduler-scheduler" Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.568152 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="34863810-2f18-4480-a22c-d4a953287b50" containerName="nova-scheduler-scheduler" Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.568510 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="34863810-2f18-4480-a22c-d4a953287b50" containerName="nova-scheduler-scheduler" Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.569365 5008 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.571681 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.576039 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.699198 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26207e6-102f-4160-be7d-e1cad865fcc6-config-data\") pod \"nova-scheduler-0\" (UID: \"f26207e6-102f-4160-be7d-e1cad865fcc6\") " pod="openstack/nova-scheduler-0" Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.699249 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26207e6-102f-4160-be7d-e1cad865fcc6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f26207e6-102f-4160-be7d-e1cad865fcc6\") " pod="openstack/nova-scheduler-0" Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.699317 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdh28\" (UniqueName: \"kubernetes.io/projected/f26207e6-102f-4160-be7d-e1cad865fcc6-kube-api-access-pdh28\") pod \"nova-scheduler-0\" (UID: \"f26207e6-102f-4160-be7d-e1cad865fcc6\") " pod="openstack/nova-scheduler-0" Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.800992 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26207e6-102f-4160-be7d-e1cad865fcc6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f26207e6-102f-4160-be7d-e1cad865fcc6\") " pod="openstack/nova-scheduler-0" Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.801126 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdh28\" (UniqueName: \"kubernetes.io/projected/f26207e6-102f-4160-be7d-e1cad865fcc6-kube-api-access-pdh28\") pod \"nova-scheduler-0\" (UID: \"f26207e6-102f-4160-be7d-e1cad865fcc6\") " pod="openstack/nova-scheduler-0"
Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.801238 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26207e6-102f-4160-be7d-e1cad865fcc6-config-data\") pod \"nova-scheduler-0\" (UID: \"f26207e6-102f-4160-be7d-e1cad865fcc6\") " pod="openstack/nova-scheduler-0"
Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.805940 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26207e6-102f-4160-be7d-e1cad865fcc6-config-data\") pod \"nova-scheduler-0\" (UID: \"f26207e6-102f-4160-be7d-e1cad865fcc6\") " pod="openstack/nova-scheduler-0"
Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.807701 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26207e6-102f-4160-be7d-e1cad865fcc6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f26207e6-102f-4160-be7d-e1cad865fcc6\") " pod="openstack/nova-scheduler-0"
Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.829412 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdh28\" (UniqueName: \"kubernetes.io/projected/f26207e6-102f-4160-be7d-e1cad865fcc6-kube-api-access-pdh28\") pod \"nova-scheduler-0\" (UID: \"f26207e6-102f-4160-be7d-e1cad865fcc6\") " pod="openstack/nova-scheduler-0"
Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.922421 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"96efea0e-17ae-49c4-8f5c-b7341def6878","Type":"ContainerStarted","Data":"2662973c09a191ca3f45d2dc2e683ec4746e264a085d56eb97c4eff582206931"}
Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.922494 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"96efea0e-17ae-49c4-8f5c-b7341def6878","Type":"ContainerStarted","Data":"ee06201918dbd5795e039c28701fa7c8e3bc7e2199124f057fd49d0f0f12daa4"}
Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.928961 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 18:26:34 crc kubenswrapper[5008]: I0318 18:26:34.965811 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.965736144 podStartE2EDuration="2.965736144s" podCreationTimestamp="2026-03-18 18:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:26:34.962081538 +0000 UTC m=+1451.481554617" watchObservedRunningTime="2026-03-18 18:26:34.965736144 +0000 UTC m=+1451.485209263"
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.406063 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.582418 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.668460 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5bmh8" podUID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" containerName="registry-server" probeResult="failure" output=<
Mar 18 18:26:35 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s
Mar 18 18:26:35 crc kubenswrapper[5008]: >
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.724591 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b09e1ec6-0919-4efa-bd93-350edd83b918-logs\") pod \"b09e1ec6-0919-4efa-bd93-350edd83b918\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") "
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.724693 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-combined-ca-bundle\") pod \"b09e1ec6-0919-4efa-bd93-350edd83b918\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") "
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.724714 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-public-tls-certs\") pod \"b09e1ec6-0919-4efa-bd93-350edd83b918\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") "
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.724775 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7z2m\" (UniqueName: \"kubernetes.io/projected/b09e1ec6-0919-4efa-bd93-350edd83b918-kube-api-access-w7z2m\") pod \"b09e1ec6-0919-4efa-bd93-350edd83b918\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") "
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.725231 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b09e1ec6-0919-4efa-bd93-350edd83b918-logs" (OuterVolumeSpecName: "logs") pod "b09e1ec6-0919-4efa-bd93-350edd83b918" (UID: "b09e1ec6-0919-4efa-bd93-350edd83b918"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.725337 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-config-data\") pod \"b09e1ec6-0919-4efa-bd93-350edd83b918\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") "
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.725365 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-internal-tls-certs\") pod \"b09e1ec6-0919-4efa-bd93-350edd83b918\" (UID: \"b09e1ec6-0919-4efa-bd93-350edd83b918\") "
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.725778 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b09e1ec6-0919-4efa-bd93-350edd83b918-logs\") on node \"crc\" DevicePath \"\""
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.728708 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09e1ec6-0919-4efa-bd93-350edd83b918-kube-api-access-w7z2m" (OuterVolumeSpecName: "kube-api-access-w7z2m") pod "b09e1ec6-0919-4efa-bd93-350edd83b918" (UID: "b09e1ec6-0919-4efa-bd93-350edd83b918"). InnerVolumeSpecName "kube-api-access-w7z2m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.751455 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-config-data" (OuterVolumeSpecName: "config-data") pod "b09e1ec6-0919-4efa-bd93-350edd83b918" (UID: "b09e1ec6-0919-4efa-bd93-350edd83b918"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.774131 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b09e1ec6-0919-4efa-bd93-350edd83b918" (UID: "b09e1ec6-0919-4efa-bd93-350edd83b918"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.777952 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b09e1ec6-0919-4efa-bd93-350edd83b918" (UID: "b09e1ec6-0919-4efa-bd93-350edd83b918"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.784697 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b09e1ec6-0919-4efa-bd93-350edd83b918" (UID: "b09e1ec6-0919-4efa-bd93-350edd83b918"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.827804 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.827839 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.827848 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7z2m\" (UniqueName: \"kubernetes.io/projected/b09e1ec6-0919-4efa-bd93-350edd83b918-kube-api-access-w7z2m\") on node \"crc\" DevicePath \"\""
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.827862 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.827871 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b09e1ec6-0919-4efa-bd93-350edd83b918-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.935900 5008 generic.go:334] "Generic (PLEG): container finished" podID="b09e1ec6-0919-4efa-bd93-350edd83b918" containerID="1a104a8415c8a7628d605ed30bfd389a15babac32480a323563573fe1d4dbc82" exitCode=0
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.936025 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.936959 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b09e1ec6-0919-4efa-bd93-350edd83b918","Type":"ContainerDied","Data":"1a104a8415c8a7628d605ed30bfd389a15babac32480a323563573fe1d4dbc82"}
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.937020 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b09e1ec6-0919-4efa-bd93-350edd83b918","Type":"ContainerDied","Data":"9a6f9f8046d52867f4a1dacf4a6b9589f5a190b18a0c0692654bfe585418a2cd"}
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.937044 5008 scope.go:117] "RemoveContainer" containerID="1a104a8415c8a7628d605ed30bfd389a15babac32480a323563573fe1d4dbc82"
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.942255 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f26207e6-102f-4160-be7d-e1cad865fcc6","Type":"ContainerStarted","Data":"2481964e9586771d0845e5f21af329f1722fa2234ce3442a00e54b97033e13d1"}
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.942302 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f26207e6-102f-4160-be7d-e1cad865fcc6","Type":"ContainerStarted","Data":"a545237e6de452fad29c24896e2bdd27a03ed64885aa9d48241b069b1c051e36"}
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.974823 5008 scope.go:117] "RemoveContainer" containerID="7d776ea979c029cd993818fea176393bb6b60d1801ccb1f9b619b54c5338d016"
Mar 18 18:26:35 crc kubenswrapper[5008]: I0318 18:26:35.980065 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.9800376100000001 podStartE2EDuration="1.98003761s" podCreationTimestamp="2026-03-18 18:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:26:35.959211198 +0000 UTC m=+1452.478684277" watchObservedRunningTime="2026-03-18 18:26:35.98003761 +0000 UTC m=+1452.499510709"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.004065 5008 scope.go:117] "RemoveContainer" containerID="1a104a8415c8a7628d605ed30bfd389a15babac32480a323563573fe1d4dbc82"
Mar 18 18:26:36 crc kubenswrapper[5008]: E0318 18:26:36.005807 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a104a8415c8a7628d605ed30bfd389a15babac32480a323563573fe1d4dbc82\": container with ID starting with 1a104a8415c8a7628d605ed30bfd389a15babac32480a323563573fe1d4dbc82 not found: ID does not exist" containerID="1a104a8415c8a7628d605ed30bfd389a15babac32480a323563573fe1d4dbc82"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.005838 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a104a8415c8a7628d605ed30bfd389a15babac32480a323563573fe1d4dbc82"} err="failed to get container status \"1a104a8415c8a7628d605ed30bfd389a15babac32480a323563573fe1d4dbc82\": rpc error: code = NotFound desc = could not find container \"1a104a8415c8a7628d605ed30bfd389a15babac32480a323563573fe1d4dbc82\": container with ID starting with 1a104a8415c8a7628d605ed30bfd389a15babac32480a323563573fe1d4dbc82 not found: ID does not exist"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.005860 5008 scope.go:117] "RemoveContainer" containerID="7d776ea979c029cd993818fea176393bb6b60d1801ccb1f9b619b54c5338d016"
Mar 18 18:26:36 crc kubenswrapper[5008]: E0318 18:26:36.006160 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d776ea979c029cd993818fea176393bb6b60d1801ccb1f9b619b54c5338d016\": container with ID starting with 7d776ea979c029cd993818fea176393bb6b60d1801ccb1f9b619b54c5338d016 not found: ID does not exist" containerID="7d776ea979c029cd993818fea176393bb6b60d1801ccb1f9b619b54c5338d016"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.006209 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d776ea979c029cd993818fea176393bb6b60d1801ccb1f9b619b54c5338d016"} err="failed to get container status \"7d776ea979c029cd993818fea176393bb6b60d1801ccb1f9b619b54c5338d016\": rpc error: code = NotFound desc = could not find container \"7d776ea979c029cd993818fea176393bb6b60d1801ccb1f9b619b54c5338d016\": container with ID starting with 7d776ea979c029cd993818fea176393bb6b60d1801ccb1f9b619b54c5338d016 not found: ID does not exist"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.007759 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.018762 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.036650 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 18 18:26:36 crc kubenswrapper[5008]: E0318 18:26:36.037082 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09e1ec6-0919-4efa-bd93-350edd83b918" containerName="nova-api-log"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.037100 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09e1ec6-0919-4efa-bd93-350edd83b918" containerName="nova-api-log"
Mar 18 18:26:36 crc kubenswrapper[5008]: E0318 18:26:36.037121 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09e1ec6-0919-4efa-bd93-350edd83b918" containerName="nova-api-api"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.037129 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09e1ec6-0919-4efa-bd93-350edd83b918" containerName="nova-api-api"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.037297 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09e1ec6-0919-4efa-bd93-350edd83b918" containerName="nova-api-api"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.037317 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09e1ec6-0919-4efa-bd93-350edd83b918" containerName="nova-api-log"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.039031 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.039150 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.042878 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.043714 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.043796 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.140442 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-config-data\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.140513 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.140573 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.140793 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-public-tls-certs\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.141364 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bda3600a-d612-43ec-8b45-77eccc420b0f-logs\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.141564 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxw6x\" (UniqueName: \"kubernetes.io/projected/bda3600a-d612-43ec-8b45-77eccc420b0f-kube-api-access-bxw6x\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.215175 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34863810-2f18-4480-a22c-d4a953287b50" path="/var/lib/kubelet/pods/34863810-2f18-4480-a22c-d4a953287b50/volumes"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.215997 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b09e1ec6-0919-4efa-bd93-350edd83b918" path="/var/lib/kubelet/pods/b09e1ec6-0919-4efa-bd93-350edd83b918/volumes"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.243909 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.243989 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-public-tls-certs\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.244072 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bda3600a-d612-43ec-8b45-77eccc420b0f-logs\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.244123 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxw6x\" (UniqueName: \"kubernetes.io/projected/bda3600a-d612-43ec-8b45-77eccc420b0f-kube-api-access-bxw6x\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.244152 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-config-data\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.244209 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.245825 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bda3600a-d612-43ec-8b45-77eccc420b0f-logs\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.251703 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-config-data\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.251776 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-public-tls-certs\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.251812 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.261038 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.262468 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxw6x\" (UniqueName: \"kubernetes.io/projected/bda3600a-d612-43ec-8b45-77eccc420b0f-kube-api-access-bxw6x\") pod \"nova-api-0\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") " pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.354165 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.879603 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 18:26:36 crc kubenswrapper[5008]: I0318 18:26:36.957680 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bda3600a-d612-43ec-8b45-77eccc420b0f","Type":"ContainerStarted","Data":"2fabe4b5ebd4874832d3f316148bf564ee1303e60043065e81a128ff85cd3852"}
Mar 18 18:26:37 crc kubenswrapper[5008]: I0318 18:26:37.395690 5008 scope.go:117] "RemoveContainer" containerID="7139634e0568e4bad331a4250ab028a69538c5ecaa474e7a149aec84cf48a5a5"
Mar 18 18:26:37 crc kubenswrapper[5008]: I0318 18:26:37.966820 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bda3600a-d612-43ec-8b45-77eccc420b0f","Type":"ContainerStarted","Data":"4464660f5642808fbb37e8da2bb43669e2159a066dcb94276d716910e10ff0cd"}
Mar 18 18:26:37 crc kubenswrapper[5008]: I0318 18:26:37.966874 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bda3600a-d612-43ec-8b45-77eccc420b0f","Type":"ContainerStarted","Data":"b41987f381e2bd0e36d9ddaace1b57cfba485aef900b80078b6110928f0b7ca2"}
Mar 18 18:26:37 crc kubenswrapper[5008]: I0318 18:26:37.997866 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.997847185 podStartE2EDuration="2.997847185s" podCreationTimestamp="2026-03-18 18:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 18:26:37.983935846 +0000 UTC m=+1454.503408955" watchObservedRunningTime="2026-03-18 18:26:37.997847185 +0000 UTC m=+1454.517320264"
Mar 18 18:26:39 crc kubenswrapper[5008]: I0318 18:26:39.930065 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 18 18:26:43 crc kubenswrapper[5008]: I0318 18:26:43.297437 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 18 18:26:43 crc kubenswrapper[5008]: I0318 18:26:43.297871 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 18 18:26:44 crc kubenswrapper[5008]: I0318 18:26:44.314794 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="96efea0e-17ae-49c4-8f5c-b7341def6878" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:26:44 crc kubenswrapper[5008]: I0318 18:26:44.314794 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="96efea0e-17ae-49c4-8f5c-b7341def6878" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:26:44 crc kubenswrapper[5008]: I0318 18:26:44.662442 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5bmh8"
Mar 18 18:26:44 crc kubenswrapper[5008]: I0318 18:26:44.742980 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5bmh8"
Mar 18 18:26:44 crc kubenswrapper[5008]: I0318 18:26:44.929264 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 18 18:26:44 crc kubenswrapper[5008]: I0318 18:26:44.958174 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 18 18:26:45 crc kubenswrapper[5008]: I0318 18:26:45.066540 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 18 18:26:45 crc kubenswrapper[5008]: I0318 18:26:45.459009 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5bmh8"]
Mar 18 18:26:46 crc kubenswrapper[5008]: I0318 18:26:46.043365 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5bmh8" podUID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" containerName="registry-server" containerID="cri-o://7cae9446f342ae7835cb2dea80dde05cc0764e5833362e5f1bed967c1766384b" gracePeriod=2
Mar 18 18:26:46 crc kubenswrapper[5008]: I0318 18:26:46.354739 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 18:26:46 crc kubenswrapper[5008]: I0318 18:26:46.355048 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 18 18:26:46 crc kubenswrapper[5008]: I0318 18:26:46.627980 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5bmh8"
Mar 18 18:26:46 crc kubenswrapper[5008]: I0318 18:26:46.783595 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-catalog-content\") pod \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\" (UID: \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\") "
Mar 18 18:26:46 crc kubenswrapper[5008]: I0318 18:26:46.783694 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-utilities\") pod \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\" (UID: \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\") "
Mar 18 18:26:46 crc kubenswrapper[5008]: I0318 18:26:46.783843 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jncsl\" (UniqueName: \"kubernetes.io/projected/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-kube-api-access-jncsl\") pod \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\" (UID: \"d9fa9d31-a966-44fc-a326-66ed75f7d7bc\") "
Mar 18 18:26:46 crc kubenswrapper[5008]: I0318 18:26:46.784479 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-utilities" (OuterVolumeSpecName: "utilities") pod "d9fa9d31-a966-44fc-a326-66ed75f7d7bc" (UID: "d9fa9d31-a966-44fc-a326-66ed75f7d7bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:26:46 crc kubenswrapper[5008]: I0318 18:26:46.791624 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-kube-api-access-jncsl" (OuterVolumeSpecName: "kube-api-access-jncsl") pod "d9fa9d31-a966-44fc-a326-66ed75f7d7bc" (UID: "d9fa9d31-a966-44fc-a326-66ed75f7d7bc"). InnerVolumeSpecName "kube-api-access-jncsl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:26:46 crc kubenswrapper[5008]: I0318 18:26:46.892785 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:26:46 crc kubenswrapper[5008]: I0318 18:26:46.892819 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jncsl\" (UniqueName: \"kubernetes.io/projected/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-kube-api-access-jncsl\") on node \"crc\" DevicePath \"\""
Mar 18 18:26:46 crc kubenswrapper[5008]: I0318 18:26:46.958126 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9fa9d31-a966-44fc-a326-66ed75f7d7bc" (UID: "d9fa9d31-a966-44fc-a326-66ed75f7d7bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:26:46 crc kubenswrapper[5008]: I0318 18:26:46.994168 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fa9d31-a966-44fc-a326-66ed75f7d7bc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.061328 5008 generic.go:334] "Generic (PLEG): container finished" podID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" containerID="7cae9446f342ae7835cb2dea80dde05cc0764e5833362e5f1bed967c1766384b" exitCode=0
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.061380 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bmh8" event={"ID":"d9fa9d31-a966-44fc-a326-66ed75f7d7bc","Type":"ContainerDied","Data":"7cae9446f342ae7835cb2dea80dde05cc0764e5833362e5f1bed967c1766384b"}
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.061412 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bmh8" event={"ID":"d9fa9d31-a966-44fc-a326-66ed75f7d7bc","Type":"ContainerDied","Data":"4a62c8e978e0f88ecd969d80f1dd20cd2532c82d3d6c31718ef6486921e17402"}
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.061436 5008 scope.go:117] "RemoveContainer" containerID="7cae9446f342ae7835cb2dea80dde05cc0764e5833362e5f1bed967c1766384b"
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.061607 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5bmh8"
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.101941 5008 scope.go:117] "RemoveContainer" containerID="3d590dc2419535af09f4ad93769ea53d4eb22eb4d17c67bc594495e674d2ce9d"
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.134603 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5bmh8"]
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.142841 5008 scope.go:117] "RemoveContainer" containerID="b1fe00ee92bf5415e3aa397735629332885b143f8ea000da5768f6d5cd379391"
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.144506 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5bmh8"]
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.174683 5008 scope.go:117] "RemoveContainer" containerID="7cae9446f342ae7835cb2dea80dde05cc0764e5833362e5f1bed967c1766384b"
Mar 18 18:26:47 crc kubenswrapper[5008]: E0318 18:26:47.175212 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cae9446f342ae7835cb2dea80dde05cc0764e5833362e5f1bed967c1766384b\": container with ID starting with 7cae9446f342ae7835cb2dea80dde05cc0764e5833362e5f1bed967c1766384b not found: ID does not exist" containerID="7cae9446f342ae7835cb2dea80dde05cc0764e5833362e5f1bed967c1766384b"
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.175254 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cae9446f342ae7835cb2dea80dde05cc0764e5833362e5f1bed967c1766384b"} err="failed to get container status \"7cae9446f342ae7835cb2dea80dde05cc0764e5833362e5f1bed967c1766384b\": rpc error: code = NotFound desc = could not find container \"7cae9446f342ae7835cb2dea80dde05cc0764e5833362e5f1bed967c1766384b\": container with ID starting with 7cae9446f342ae7835cb2dea80dde05cc0764e5833362e5f1bed967c1766384b not found: ID does not exist"
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.175281 5008 scope.go:117] "RemoveContainer" containerID="3d590dc2419535af09f4ad93769ea53d4eb22eb4d17c67bc594495e674d2ce9d"
Mar 18 18:26:47 crc kubenswrapper[5008]: E0318 18:26:47.175604 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d590dc2419535af09f4ad93769ea53d4eb22eb4d17c67bc594495e674d2ce9d\": container with ID starting with 3d590dc2419535af09f4ad93769ea53d4eb22eb4d17c67bc594495e674d2ce9d not found: ID does not exist" containerID="3d590dc2419535af09f4ad93769ea53d4eb22eb4d17c67bc594495e674d2ce9d"
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.175624 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d590dc2419535af09f4ad93769ea53d4eb22eb4d17c67bc594495e674d2ce9d"} err="failed to get container status \"3d590dc2419535af09f4ad93769ea53d4eb22eb4d17c67bc594495e674d2ce9d\": rpc error: code = NotFound desc = could not find container \"3d590dc2419535af09f4ad93769ea53d4eb22eb4d17c67bc594495e674d2ce9d\": container with ID starting with 3d590dc2419535af09f4ad93769ea53d4eb22eb4d17c67bc594495e674d2ce9d not found: ID does not exist"
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.175636 5008 scope.go:117] "RemoveContainer" containerID="b1fe00ee92bf5415e3aa397735629332885b143f8ea000da5768f6d5cd379391"
Mar 18 18:26:47 crc kubenswrapper[5008]: E0318 18:26:47.176074 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1fe00ee92bf5415e3aa397735629332885b143f8ea000da5768f6d5cd379391\": container with ID starting with b1fe00ee92bf5415e3aa397735629332885b143f8ea000da5768f6d5cd379391 not found: ID does not exist" containerID="b1fe00ee92bf5415e3aa397735629332885b143f8ea000da5768f6d5cd379391"
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.176130 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1fe00ee92bf5415e3aa397735629332885b143f8ea000da5768f6d5cd379391"} err="failed to get container status \"b1fe00ee92bf5415e3aa397735629332885b143f8ea000da5768f6d5cd379391\": rpc error: code = NotFound desc = could not find container \"b1fe00ee92bf5415e3aa397735629332885b143f8ea000da5768f6d5cd379391\": container with ID starting with b1fe00ee92bf5415e3aa397735629332885b143f8ea000da5768f6d5cd379391 not found: ID does not exist"
Mar 18 18:26:47 crc kubenswrapper[5008]: E0318 18:26:47.279270 5008 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9fa9d31_a966_44fc_a326_66ed75f7d7bc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9fa9d31_a966_44fc_a326_66ed75f7d7bc.slice/crio-4a62c8e978e0f88ecd969d80f1dd20cd2532c82d3d6c31718ef6486921e17402\": RecentStats: unable to find data in memory cache]" 
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 18:26:47.367698 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bda3600a-d612-43ec-8b45-77eccc420b0f" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 18 18:26:47 crc kubenswrapper[5008]: I0318 
18:26:47.367772 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bda3600a-d612-43ec-8b45-77eccc420b0f" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 18:26:48 crc kubenswrapper[5008]: I0318 18:26:48.214949 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" path="/var/lib/kubelet/pods/d9fa9d31-a966-44fc-a326-66ed75f7d7bc/volumes" Mar 18 18:26:51 crc kubenswrapper[5008]: I0318 18:26:51.296962 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 18:26:51 crc kubenswrapper[5008]: I0318 18:26:51.299228 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 18:26:53 crc kubenswrapper[5008]: I0318 18:26:53.303589 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 18:26:53 crc kubenswrapper[5008]: I0318 18:26:53.306512 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 18:26:53 crc kubenswrapper[5008]: I0318 18:26:53.318612 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 18:26:54 crc kubenswrapper[5008]: I0318 18:26:54.142846 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 18:26:54 crc kubenswrapper[5008]: I0318 18:26:54.354439 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 18:26:54 crc kubenswrapper[5008]: I0318 18:26:54.354495 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 18:26:56 crc kubenswrapper[5008]: I0318 18:26:56.381726 5008 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 18:26:56 crc kubenswrapper[5008]: I0318 18:26:56.384842 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 18:26:56 crc kubenswrapper[5008]: I0318 18:26:56.396481 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 18:26:57 crc kubenswrapper[5008]: I0318 18:26:57.169931 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 18:26:57 crc kubenswrapper[5008]: I0318 18:26:57.215535 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 18:27:18 crc kubenswrapper[5008]: I0318 18:27:18.820228 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 18 18:27:18 crc kubenswrapper[5008]: I0318 18:27:18.820961 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="1e0cfb4d-8438-45bc-882c-27c9544b40a5" containerName="openstackclient" containerID="cri-o://3a98cf2e27abc88c220c6aeb0751a297dd22a0499c8e9a35f107d7f5b969d959" gracePeriod=2 Mar 18 18:27:18 crc kubenswrapper[5008]: I0318 18:27:18.829740 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 18 18:27:18 crc kubenswrapper[5008]: I0318 18:27:18.992619 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-j5dtq"] Mar 18 18:27:18 crc kubenswrapper[5008]: E0318 18:27:18.993435 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" containerName="extract-utilities" Mar 18 18:27:18 crc kubenswrapper[5008]: I0318 18:27:18.993448 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" containerName="extract-utilities" Mar 18 18:27:18 crc kubenswrapper[5008]: E0318 18:27:18.993478 
5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" containerName="extract-content" Mar 18 18:27:18 crc kubenswrapper[5008]: I0318 18:27:18.993484 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" containerName="extract-content" Mar 18 18:27:18 crc kubenswrapper[5008]: E0318 18:27:18.993505 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" containerName="registry-server" Mar 18 18:27:18 crc kubenswrapper[5008]: I0318 18:27:18.993511 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" containerName="registry-server" Mar 18 18:27:18 crc kubenswrapper[5008]: E0318 18:27:18.993529 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0cfb4d-8438-45bc-882c-27c9544b40a5" containerName="openstackclient" Mar 18 18:27:18 crc kubenswrapper[5008]: I0318 18:27:18.993534 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0cfb4d-8438-45bc-882c-27c9544b40a5" containerName="openstackclient" Mar 18 18:27:18 crc kubenswrapper[5008]: I0318 18:27:18.993868 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e0cfb4d-8438-45bc-882c-27c9544b40a5" containerName="openstackclient" Mar 18 18:27:18 crc kubenswrapper[5008]: I0318 18:27:18.993902 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9fa9d31-a966-44fc-a326-66ed75f7d7bc" containerName="registry-server" Mar 18 18:27:18 crc kubenswrapper[5008]: I0318 18:27:18.994659 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j5dtq" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.001727 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.041636 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j5dtq"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.087664 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v8wg\" (UniqueName: \"kubernetes.io/projected/8724ccad-851e-4efc-ad3c-d34252a3f29f-kube-api-access-5v8wg\") pod \"root-account-create-update-j5dtq\" (UID: \"8724ccad-851e-4efc-ad3c-d34252a3f29f\") " pod="openstack/root-account-create-update-j5dtq" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.087776 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts\") pod \"root-account-create-update-j5dtq\" (UID: \"8724ccad-851e-4efc-ad3c-d34252a3f29f\") " pod="openstack/root-account-create-update-j5dtq" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.092613 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-79e6-account-create-update-jqqxt"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.094119 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-79e6-account-create-update-jqqxt" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.102875 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.187031 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-8wr5b"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.190044 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v8wg\" (UniqueName: \"kubernetes.io/projected/8724ccad-851e-4efc-ad3c-d34252a3f29f-kube-api-access-5v8wg\") pod \"root-account-create-update-j5dtq\" (UID: \"8724ccad-851e-4efc-ad3c-d34252a3f29f\") " pod="openstack/root-account-create-update-j5dtq" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.190155 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7csbz\" (UniqueName: \"kubernetes.io/projected/1d951b25-e886-44c9-b7f7-d60853e1e0a9-kube-api-access-7csbz\") pod \"barbican-79e6-account-create-update-jqqxt\" (UID: \"1d951b25-e886-44c9-b7f7-d60853e1e0a9\") " pod="openstack/barbican-79e6-account-create-update-jqqxt" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.190181 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts\") pod \"root-account-create-update-j5dtq\" (UID: \"8724ccad-851e-4efc-ad3c-d34252a3f29f\") " pod="openstack/root-account-create-update-j5dtq" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.190210 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d951b25-e886-44c9-b7f7-d60853e1e0a9-operator-scripts\") pod 
\"barbican-79e6-account-create-update-jqqxt\" (UID: \"1d951b25-e886-44c9-b7f7-d60853e1e0a9\") " pod="openstack/barbican-79e6-account-create-update-jqqxt" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.191138 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts\") pod \"root-account-create-update-j5dtq\" (UID: \"8724ccad-851e-4efc-ad3c-d34252a3f29f\") " pod="openstack/root-account-create-update-j5dtq" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.209238 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-8wr5b"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.235375 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v8wg\" (UniqueName: \"kubernetes.io/projected/8724ccad-851e-4efc-ad3c-d34252a3f29f-kube-api-access-5v8wg\") pod \"root-account-create-update-j5dtq\" (UID: \"8724ccad-851e-4efc-ad3c-d34252a3f29f\") " pod="openstack/root-account-create-update-j5dtq" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.243985 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-79e6-account-create-update-jqqxt"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.257431 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9abb-account-create-update-z2jlj"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.258648 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9abb-account-create-update-z2jlj" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.261098 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.293568 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7csbz\" (UniqueName: \"kubernetes.io/projected/1d951b25-e886-44c9-b7f7-d60853e1e0a9-kube-api-access-7csbz\") pod \"barbican-79e6-account-create-update-jqqxt\" (UID: \"1d951b25-e886-44c9-b7f7-d60853e1e0a9\") " pod="openstack/barbican-79e6-account-create-update-jqqxt" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.293627 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb8b8df-bedb-4e35-b709-25e83be00470-operator-scripts\") pod \"neutron-9abb-account-create-update-z2jlj\" (UID: \"efb8b8df-bedb-4e35-b709-25e83be00470\") " pod="openstack/neutron-9abb-account-create-update-z2jlj" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.293648 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrpss\" (UniqueName: \"kubernetes.io/projected/efb8b8df-bedb-4e35-b709-25e83be00470-kube-api-access-vrpss\") pod \"neutron-9abb-account-create-update-z2jlj\" (UID: \"efb8b8df-bedb-4e35-b709-25e83be00470\") " pod="openstack/neutron-9abb-account-create-update-z2jlj" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.293669 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d951b25-e886-44c9-b7f7-d60853e1e0a9-operator-scripts\") pod \"barbican-79e6-account-create-update-jqqxt\" (UID: \"1d951b25-e886-44c9-b7f7-d60853e1e0a9\") " pod="openstack/barbican-79e6-account-create-update-jqqxt" Mar 18 18:27:19 crc 
kubenswrapper[5008]: I0318 18:27:19.294480 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d951b25-e886-44c9-b7f7-d60853e1e0a9-operator-scripts\") pod \"barbican-79e6-account-create-update-jqqxt\" (UID: \"1d951b25-e886-44c9-b7f7-d60853e1e0a9\") " pod="openstack/barbican-79e6-account-create-update-jqqxt" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.307210 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9abb-account-create-update-z2jlj"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.336711 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-79e6-account-create-update-vkr56"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.353823 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.356994 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="4edb3df2-7960-412a-ba0f-32bd8fdabc86" containerName="openstack-network-exporter" containerID="cri-o://7e44b2e0f1ce0f0062f29542f63b0c0364c6a86812c315179e49f26887d11a6d" gracePeriod=300 Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.372515 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j5dtq" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.390198 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7csbz\" (UniqueName: \"kubernetes.io/projected/1d951b25-e886-44c9-b7f7-d60853e1e0a9-kube-api-access-7csbz\") pod \"barbican-79e6-account-create-update-jqqxt\" (UID: \"1d951b25-e886-44c9-b7f7-d60853e1e0a9\") " pod="openstack/barbican-79e6-account-create-update-jqqxt" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.397445 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb8b8df-bedb-4e35-b709-25e83be00470-operator-scripts\") pod \"neutron-9abb-account-create-update-z2jlj\" (UID: \"efb8b8df-bedb-4e35-b709-25e83be00470\") " pod="openstack/neutron-9abb-account-create-update-z2jlj" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.397505 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrpss\" (UniqueName: \"kubernetes.io/projected/efb8b8df-bedb-4e35-b709-25e83be00470-kube-api-access-vrpss\") pod \"neutron-9abb-account-create-update-z2jlj\" (UID: \"efb8b8df-bedb-4e35-b709-25e83be00470\") " pod="openstack/neutron-9abb-account-create-update-z2jlj" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.398345 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb8b8df-bedb-4e35-b709-25e83be00470-operator-scripts\") pod \"neutron-9abb-account-create-update-z2jlj\" (UID: \"efb8b8df-bedb-4e35-b709-25e83be00470\") " pod="openstack/neutron-9abb-account-create-update-z2jlj" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.413162 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.413448 5008 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="7f1c2fc8-83c6-4183-ac62-f23ad5db8610" containerName="ovn-northd" containerID="cri-o://a68259ee70a42cd608552137a47d03512f5b8b70baec67d791e451ee9cf46bac" gracePeriod=30 Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.413861 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="7f1c2fc8-83c6-4183-ac62-f23ad5db8610" containerName="openstack-network-exporter" containerID="cri-o://c17a1b2d5a41cb5fcdc52c5477e557d5c46788d343c6e07f1cda8f8d094698b3" gracePeriod=30 Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.424668 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-79e6-account-create-update-jqqxt" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.434497 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrpss\" (UniqueName: \"kubernetes.io/projected/efb8b8df-bedb-4e35-b709-25e83be00470-kube-api-access-vrpss\") pod \"neutron-9abb-account-create-update-z2jlj\" (UID: \"efb8b8df-bedb-4e35-b709-25e83be00470\") " pod="openstack/neutron-9abb-account-create-update-z2jlj" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.450779 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-79e6-account-create-update-vkr56"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.477441 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:27:19 crc kubenswrapper[5008]: E0318 18:27:19.500443 5008 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 18:27:19 crc kubenswrapper[5008]: E0318 18:27:19.500899 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data podName:b60d757b-db66-46c1-ad92-4a9e591217a0 
nodeName:}" failed. No retries permitted until 2026-03-18 18:27:20.000880659 +0000 UTC m=+1496.520353738 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data") pod "rabbitmq-cell1-server-0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0") : configmap "rabbitmq-cell1-config-data" not found Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.538859 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9abb-account-create-update-sp975"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.553059 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9abb-account-create-update-sp975"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.561622 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3502-account-create-update-qzqhp"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.572419 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3502-account-create-update-qzqhp"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.583707 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3502-account-create-update-qzqhp" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.588702 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3502-account-create-update-m52l9"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.607650 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8x6nb"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.621917 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.622032 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3502-account-create-update-m52l9"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.622761 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9abb-account-create-update-z2jlj" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.633263 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="4edb3df2-7960-412a-ba0f-32bd8fdabc86" containerName="ovsdbserver-sb" containerID="cri-o://7105fd9adfc4911e01e2a18a48dd35e4e9e7daabc38c06b0e726445c47171d4a" gracePeriod=300 Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.648723 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-efa2-account-create-update-pwnw8"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.649872 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-efa2-account-create-update-pwnw8" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.681098 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8x6nb"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.681809 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.714729 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-efa2-account-create-update-pwnw8"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.737438 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de40aba-2f95-4c75-8875-8ffbf5f17898-operator-scripts\") pod \"nova-api-efa2-account-create-update-pwnw8\" (UID: \"5de40aba-2f95-4c75-8875-8ffbf5f17898\") " pod="openstack/nova-api-efa2-account-create-update-pwnw8" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.738217 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rb8q\" (UniqueName: \"kubernetes.io/projected/5de40aba-2f95-4c75-8875-8ffbf5f17898-kube-api-access-6rb8q\") pod \"nova-api-efa2-account-create-update-pwnw8\" (UID: \"5de40aba-2f95-4c75-8875-8ffbf5f17898\") " pod="openstack/nova-api-efa2-account-create-update-pwnw8" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.738379 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxmv7\" (UniqueName: \"kubernetes.io/projected/8026d1a2-1e3a-4930-9424-56565551f4bb-kube-api-access-nxmv7\") pod \"cinder-3502-account-create-update-qzqhp\" (UID: \"8026d1a2-1e3a-4930-9424-56565551f4bb\") " pod="openstack/cinder-3502-account-create-update-qzqhp" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.738990 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8026d1a2-1e3a-4930-9424-56565551f4bb-operator-scripts\") pod \"cinder-3502-account-create-update-qzqhp\" (UID: \"8026d1a2-1e3a-4930-9424-56565551f4bb\") " pod="openstack/cinder-3502-account-create-update-qzqhp" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.775231 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2f63-account-create-update-z57k6"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.798364 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2f63-account-create-update-z57k6" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.815233 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.816004 5008 generic.go:334] "Generic (PLEG): container finished" podID="4edb3df2-7960-412a-ba0f-32bd8fdabc86" containerID="7e44b2e0f1ce0f0062f29542f63b0c0364c6a86812c315179e49f26887d11a6d" exitCode=2 Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.816159 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-jqd2z"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.816184 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4edb3df2-7960-412a-ba0f-32bd8fdabc86","Type":"ContainerDied","Data":"7e44b2e0f1ce0f0062f29542f63b0c0364c6a86812c315179e49f26887d11a6d"} Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.828248 5008 generic.go:334] "Generic (PLEG): container finished" podID="7f1c2fc8-83c6-4183-ac62-f23ad5db8610" containerID="c17a1b2d5a41cb5fcdc52c5477e557d5c46788d343c6e07f1cda8f8d094698b3" exitCode=2 Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.828294 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-northd-0" event={"ID":"7f1c2fc8-83c6-4183-ac62-f23ad5db8610","Type":"ContainerDied","Data":"c17a1b2d5a41cb5fcdc52c5477e557d5c46788d343c6e07f1cda8f8d094698b3"} Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.855602 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8026d1a2-1e3a-4930-9424-56565551f4bb-operator-scripts\") pod \"cinder-3502-account-create-update-qzqhp\" (UID: \"8026d1a2-1e3a-4930-9424-56565551f4bb\") " pod="openstack/cinder-3502-account-create-update-qzqhp" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.855716 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de40aba-2f95-4c75-8875-8ffbf5f17898-operator-scripts\") pod \"nova-api-efa2-account-create-update-pwnw8\" (UID: \"5de40aba-2f95-4c75-8875-8ffbf5f17898\") " pod="openstack/nova-api-efa2-account-create-update-pwnw8" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.855740 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rb8q\" (UniqueName: \"kubernetes.io/projected/5de40aba-2f95-4c75-8875-8ffbf5f17898-kube-api-access-6rb8q\") pod \"nova-api-efa2-account-create-update-pwnw8\" (UID: \"5de40aba-2f95-4c75-8875-8ffbf5f17898\") " pod="openstack/nova-api-efa2-account-create-update-pwnw8" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.855765 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b1de51-7913-41fc-afd9-b1f901532d03-operator-scripts\") pod \"nova-cell0-2f63-account-create-update-z57k6\" (UID: \"a5b1de51-7913-41fc-afd9-b1f901532d03\") " pod="openstack/nova-cell0-2f63-account-create-update-z57k6" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.855788 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nxmv7\" (UniqueName: \"kubernetes.io/projected/8026d1a2-1e3a-4930-9424-56565551f4bb-kube-api-access-nxmv7\") pod \"cinder-3502-account-create-update-qzqhp\" (UID: \"8026d1a2-1e3a-4930-9424-56565551f4bb\") " pod="openstack/cinder-3502-account-create-update-qzqhp" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.855873 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbgwt\" (UniqueName: \"kubernetes.io/projected/a5b1de51-7913-41fc-afd9-b1f901532d03-kube-api-access-rbgwt\") pod \"nova-cell0-2f63-account-create-update-z57k6\" (UID: \"a5b1de51-7913-41fc-afd9-b1f901532d03\") " pod="openstack/nova-cell0-2f63-account-create-update-z57k6" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.857287 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de40aba-2f95-4c75-8875-8ffbf5f17898-operator-scripts\") pod \"nova-api-efa2-account-create-update-pwnw8\" (UID: \"5de40aba-2f95-4c75-8875-8ffbf5f17898\") " pod="openstack/nova-api-efa2-account-create-update-pwnw8" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.863078 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8026d1a2-1e3a-4930-9424-56565551f4bb-operator-scripts\") pod \"cinder-3502-account-create-update-qzqhp\" (UID: \"8026d1a2-1e3a-4930-9424-56565551f4bb\") " pod="openstack/cinder-3502-account-create-update-qzqhp" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.873617 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2f63-account-create-update-z57k6"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.915369 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rb8q\" (UniqueName: 
\"kubernetes.io/projected/5de40aba-2f95-4c75-8875-8ffbf5f17898-kube-api-access-6rb8q\") pod \"nova-api-efa2-account-create-update-pwnw8\" (UID: \"5de40aba-2f95-4c75-8875-8ffbf5f17898\") " pod="openstack/nova-api-efa2-account-create-update-pwnw8" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.937828 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxmv7\" (UniqueName: \"kubernetes.io/projected/8026d1a2-1e3a-4930-9424-56565551f4bb-kube-api-access-nxmv7\") pod \"cinder-3502-account-create-update-qzqhp\" (UID: \"8026d1a2-1e3a-4930-9424-56565551f4bb\") " pod="openstack/cinder-3502-account-create-update-qzqhp" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.944164 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-jqd2z"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.957311 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b1de51-7913-41fc-afd9-b1f901532d03-operator-scripts\") pod \"nova-cell0-2f63-account-create-update-z57k6\" (UID: \"a5b1de51-7913-41fc-afd9-b1f901532d03\") " pod="openstack/nova-cell0-2f63-account-create-update-z57k6" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.957391 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbgwt\" (UniqueName: \"kubernetes.io/projected/a5b1de51-7913-41fc-afd9-b1f901532d03-kube-api-access-rbgwt\") pod \"nova-cell0-2f63-account-create-update-z57k6\" (UID: \"a5b1de51-7913-41fc-afd9-b1f901532d03\") " pod="openstack/nova-cell0-2f63-account-create-update-z57k6" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.963604 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b1de51-7913-41fc-afd9-b1f901532d03-operator-scripts\") pod \"nova-cell0-2f63-account-create-update-z57k6\" (UID: 
\"a5b1de51-7913-41fc-afd9-b1f901532d03\") " pod="openstack/nova-cell0-2f63-account-create-update-z57k6" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.976639 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ae04-account-create-update-z85w5"] Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.977752 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ae04-account-create-update-z85w5" Mar 18 18:27:19 crc kubenswrapper[5008]: I0318 18:27:19.989817 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.012325 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbgwt\" (UniqueName: \"kubernetes.io/projected/a5b1de51-7913-41fc-afd9-b1f901532d03-kube-api-access-rbgwt\") pod \"nova-cell0-2f63-account-create-update-z57k6\" (UID: \"a5b1de51-7913-41fc-afd9-b1f901532d03\") " pod="openstack/nova-cell0-2f63-account-create-update-z57k6" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.043487 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ae04-account-create-update-z85w5"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.190329 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3502-account-create-update-qzqhp" Mar 18 18:27:20 crc kubenswrapper[5008]: E0318 18:27:20.191953 5008 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 18:27:20 crc kubenswrapper[5008]: E0318 18:27:20.192106 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data podName:b60d757b-db66-46c1-ad92-4a9e591217a0 nodeName:}" failed. 
No retries permitted until 2026-03-18 18:27:21.192059427 +0000 UTC m=+1497.711532506 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data") pod "rabbitmq-cell1-server-0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0") : configmap "rabbitmq-cell1-config-data" not found Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.193263 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-efa2-account-create-update-pwnw8" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.207319 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2f63-account-create-update-z57k6" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.295116 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8wpf\" (UniqueName: \"kubernetes.io/projected/5bc70fb8-0d00-4c19-a4d2-1721527c51e1-kube-api-access-v8wpf\") pod \"nova-cell1-ae04-account-create-update-z85w5\" (UID: \"5bc70fb8-0d00-4c19-a4d2-1721527c51e1\") " pod="openstack/nova-cell1-ae04-account-create-update-z85w5" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.295666 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bc70fb8-0d00-4c19-a4d2-1721527c51e1-operator-scripts\") pod \"nova-cell1-ae04-account-create-update-z85w5\" (UID: \"5bc70fb8-0d00-4c19-a4d2-1721527c51e1\") " pod="openstack/nova-cell1-ae04-account-create-update-z85w5" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.400050 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bc70fb8-0d00-4c19-a4d2-1721527c51e1-operator-scripts\") pod \"nova-cell1-ae04-account-create-update-z85w5\" (UID: 
\"5bc70fb8-0d00-4c19-a4d2-1721527c51e1\") " pod="openstack/nova-cell1-ae04-account-create-update-z85w5" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.400293 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8wpf\" (UniqueName: \"kubernetes.io/projected/5bc70fb8-0d00-4c19-a4d2-1721527c51e1-kube-api-access-v8wpf\") pod \"nova-cell1-ae04-account-create-update-z85w5\" (UID: \"5bc70fb8-0d00-4c19-a4d2-1721527c51e1\") " pod="openstack/nova-cell1-ae04-account-create-update-z85w5" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.401319 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bc70fb8-0d00-4c19-a4d2-1721527c51e1-operator-scripts\") pod \"nova-cell1-ae04-account-create-update-z85w5\" (UID: \"5bc70fb8-0d00-4c19-a4d2-1721527c51e1\") " pod="openstack/nova-cell1-ae04-account-create-update-z85w5" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.402961 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1558ccb4-f7d0-4b6d-a458-b13cb927f6b3" path="/var/lib/kubelet/pods/1558ccb4-f7d0-4b6d-a458-b13cb927f6b3/volumes" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.403665 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8221015a-61ac-474e-97ea-be2c233e4139" path="/var/lib/kubelet/pods/8221015a-61ac-474e-97ea-be2c233e4139/volumes" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.404209 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82798150-dec0-4bf6-a917-afe9e9ad020d" path="/var/lib/kubelet/pods/82798150-dec0-4bf6-a917-afe9e9ad020d/volumes" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.406146 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="906dac4b-4209-4b3b-b934-6804508c028b" path="/var/lib/kubelet/pods/906dac4b-4209-4b3b-b934-6804508c028b/volumes" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 
18:27:20.406769 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3fb1a3-21e9-4a67-8c15-e0e12424a5df" path="/var/lib/kubelet/pods/ac3fb1a3-21e9-4a67-8c15-e0e12424a5df/volumes" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.409090 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6bef0b8-65da-409d-967a-5b49a28835d3" path="/var/lib/kubelet/pods/f6bef0b8-65da-409d-967a-5b49a28835d3/volumes" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.409696 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-887vw"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.409728 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-efa2-account-create-update-xrxs7"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.427311 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-887vw"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.443017 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-efa2-account-create-update-xrxs7"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.463260 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8wpf\" (UniqueName: \"kubernetes.io/projected/5bc70fb8-0d00-4c19-a4d2-1721527c51e1-kube-api-access-v8wpf\") pod \"nova-cell1-ae04-account-create-update-z85w5\" (UID: \"5bc70fb8-0d00-4c19-a4d2-1721527c51e1\") " pod="openstack/nova-cell1-ae04-account-create-update-z85w5" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.524062 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2f63-account-create-update-fw58c"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.570696 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2f63-account-create-update-fw58c"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.598096 5008 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/barbican-db-sync-jgqxf"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.612296 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-jgqxf"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.622646 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ae04-account-create-update-hpg5p"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.644212 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ae04-account-create-update-z85w5" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.652402 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.664028 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ae04-account-create-update-hpg5p"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.682117 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-jbx2h"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.710747 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9qcqj"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.747464 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-jbx2h"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.778299 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-x8pkm"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.802472 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-78xsw"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.802711 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-78xsw" podUID="a8857503-cb26-46f0-b4a3-e931a9e3f1ed" containerName="openstack-network-exporter" 
containerID="cri-o://82263332669d05ad9b24a3bc4c19472247748eb643154b4c9d4bda10744e11ae" gracePeriod=30 Mar 18 18:27:20 crc kubenswrapper[5008]: E0318 18:27:20.811638 5008 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 18:27:20 crc kubenswrapper[5008]: E0318 18:27:20.811709 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data podName:3d5f0191-2702-46ed-ab82-e8c93ec1cf02 nodeName:}" failed. No retries permitted until 2026-03-18 18:27:21.311688176 +0000 UTC m=+1497.831161255 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data") pod "rabbitmq-server-0" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02") : configmap "rabbitmq-config-data" not found Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.822952 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5b86568468-vhc29"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.823265 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5b86568468-vhc29" podUID="1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" containerName="placement-log" containerID="cri-o://41318c69eaf0ebfd748306eccddaeed6ec2d4a9388b3be00fa57b1b52a5ad7b8" gracePeriod=30 Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.823715 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5b86568468-vhc29" podUID="1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" containerName="placement-api" containerID="cri-o://eb162fa1773c66a4240c2212925b0bc630f29080f6ea79c29b7d5dbe45bff8fa" gracePeriod=30 Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.901362 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_4edb3df2-7960-412a-ba0f-32bd8fdabc86/ovsdbserver-sb/0.log" Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.901409 5008 generic.go:334] "Generic (PLEG): container finished" podID="4edb3df2-7960-412a-ba0f-32bd8fdabc86" containerID="7105fd9adfc4911e01e2a18a48dd35e4e9e7daabc38c06b0e726445c47171d4a" exitCode=143 Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.901436 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4edb3df2-7960-412a-ba0f-32bd8fdabc86","Type":"ContainerDied","Data":"7105fd9adfc4911e01e2a18a48dd35e4e9e7daabc38c06b0e726445c47171d4a"} Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.914741 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.915395 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="defaf26d-efb3-4ab4-96fb-fe8826988fe1" containerName="openstack-network-exporter" containerID="cri-o://5baba2548d7c511ef77d2873377d1b6c415a3d1316f182988c6113417f3cbbc0" gracePeriod=300 Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.933795 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.934063 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="582dafe2-2020-4966-921d-cc5e9f0db46c" containerName="glance-log" containerID="cri-o://4efce2fc93ac7a338b3af0f031e3448a62c3f819288460beeb882dd6abd7cbe9" gracePeriod=30 Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.934202 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="582dafe2-2020-4966-921d-cc5e9f0db46c" containerName="glance-httpd" 
containerID="cri-o://7c0883124c7538aea54980f006ad5cde12fc4e637570de29a2ca5d0d49c482e5" gracePeriod=30 Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.945102 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-52mqb"] Mar 18 18:27:20 crc kubenswrapper[5008]: I0318 18:27:20.985592 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-52mqb"] Mar 18 18:27:20 crc kubenswrapper[5008]: E0318 18:27:20.988670 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:27:20 crc kubenswrapper[5008]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:27:20 crc kubenswrapper[5008]: Mar 18 18:27:20 crc kubenswrapper[5008]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:27:20 crc kubenswrapper[5008]: Mar 18 18:27:20 crc kubenswrapper[5008]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:27:20 crc kubenswrapper[5008]: Mar 18 18:27:20 crc kubenswrapper[5008]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:27:20 crc kubenswrapper[5008]: Mar 18 18:27:20 crc kubenswrapper[5008]: if [ -n "barbican" ]; then Mar 18 18:27:20 crc kubenswrapper[5008]: GRANT_DATABASE="barbican" Mar 18 18:27:20 crc kubenswrapper[5008]: else Mar 18 18:27:20 crc kubenswrapper[5008]: GRANT_DATABASE="*" Mar 18 18:27:20 crc kubenswrapper[5008]: fi Mar 18 18:27:20 crc kubenswrapper[5008]: Mar 18 18:27:20 crc kubenswrapper[5008]: # going for maximum compatibility here: Mar 18 18:27:20 crc kubenswrapper[5008]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:27:20 crc kubenswrapper[5008]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:27:20 crc kubenswrapper[5008]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:27:20 crc kubenswrapper[5008]: # support updates Mar 18 18:27:20 crc kubenswrapper[5008]: Mar 18 18:27:20 crc kubenswrapper[5008]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:27:20 crc kubenswrapper[5008]: E0318 18:27:20.990050 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-79e6-account-create-update-jqqxt" podUID="1d951b25-e886-44c9-b7f7-d60853e1e0a9" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.009295 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-g9x6h"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.054746 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-g9x6h"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.055249 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="defaf26d-efb3-4ab4-96fb-fe8826988fe1" containerName="ovsdbserver-nb" containerID="cri-o://afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54" gracePeriod=300 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.082634 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bd85b459c-gfwz9"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.082957 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" podUID="68b393c9-78fb-4bde-930d-6af4b840f9e3" containerName="dnsmasq-dns" containerID="cri-o://22db4948be656e49590d67a4b03650ce23c340a3e654f974e8ccf95e3e514659" gracePeriod=10 Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.152113 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54 is running failed: container process not found" containerID="afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.152235 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.152707 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-server" containerID="cri-o://b2ff7dec8963820747dd167a23cc98ef08104fcb6886f42e0c188e8c2d2b5557" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153044 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="swift-recon-cron" containerID="cri-o://16ca62d6ed1f662b6bd0ea0c5af9755fe9a957be9453f4224460900b731f6943" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153085 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="rsync" containerID="cri-o://380cc5591873123d91f18022fd060b0a9e10c5e3b072ae816f61a2e6ad015a78" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153118 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-expirer" containerID="cri-o://ee0fd9858e770e37fee73845e8c0a241a341746edd5488d756144d0dbce6ee7b" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153153 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" 
podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-updater" containerID="cri-o://e00afaaa564c37f366ee4ac26eb8ca94d2c1e8b26ed42d7509ff378a29f8f96a" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153181 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-auditor" containerID="cri-o://4f571497d968bf39f24266de4994c7de6a2c821baa3ad302407cf536047c662e" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153211 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-replicator" containerID="cri-o://3b7732cd3cbc9f6e46f3b52e181285bcbcb64ff5a7d634bf4399f0d57729ef65" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153284 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-server" containerID="cri-o://0b1ee5c8c45f6646ada20310701b8ec3f99b2a8128a2190acf71a6ef29f4200a" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153333 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-updater" containerID="cri-o://2a06525b664dc560a781b00430903d7869796e656f728e7637b34cc39532a99e" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153369 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-auditor" containerID="cri-o://aa11f18a13f730403dae487c0f3224a3b6d6266ab6e0fc1aab36fa0cff77ecb4" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153410 5008 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-replicator" containerID="cri-o://21b72f72c110b5cddd921bb4d2588f810988fa4c91525dd72e97c92a5f5d881d" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153465 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-server" containerID="cri-o://a3d478398fbcc00ebce85e7d90128952489a28ad02808dcc006fc3822c4fdaba" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153516 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-reaper" containerID="cri-o://b5255bfa8eb99b8162ba17c46557ccb30518ff7df2b8694d473240b663a9ce8c" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153569 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-auditor" containerID="cri-o://688589b78817d925eba18cc083d7aae7884af996d5eac87b2f9b8be694e1d743" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.153634 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-replicator" containerID="cri-o://b75695d9f9722a67c19ee08c21555a403a91fc1e836d2a6b7c94c581c39bc7e8" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.153533 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54 is running failed: 
container process not found" containerID="afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.166767 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54 is running failed: container process not found" containerID="afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.166831 5008 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="defaf26d-efb3-4ab4-96fb-fe8826988fe1" containerName="ovsdbserver-nb" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.177718 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-klvjh"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.180416 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-klvjh"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.206354 5008 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/nova-cell1-novncproxy-0" secret="" err="secret \"nova-nova-dockercfg-24rgk\" not found" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.223234 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.223476 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8679cebf-8eea-45ae-be70-26eea9396f8e" containerName="glance-log" containerID="cri-o://48672c4f4a417aeb7d70b46843dbaf3f5264f47434917232456154f0d644258b" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.223630 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8679cebf-8eea-45ae-be70-26eea9396f8e" containerName="glance-httpd" containerID="cri-o://1bbd2b9a3501779f9dfc17cd725e0dca6af96fcee035c33d265e2278978c5d37" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.242448 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-835c-account-create-update-6znqk"] Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.271691 5008 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.271767 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data podName:b60d757b-db66-46c1-ad92-4a9e591217a0 nodeName:}" failed. No retries permitted until 2026-03-18 18:27:23.271745045 +0000 UTC m=+1499.791218124 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data") pod "rabbitmq-cell1-server-0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0") : configmap "rabbitmq-cell1-config-data" not found Mar 18 18:27:21 crc kubenswrapper[5008]: W0318 18:27:21.343148 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefb8b8df_bedb_4e35_b709_25e83be00470.slice/crio-127d2c079fbd21e4853737582a34391032b25f34b9809690c073c3a085dcc538 WatchSource:0}: Error finding container 127d2c079fbd21e4853737582a34391032b25f34b9809690c073c3a085dcc538: Status 404 returned error can't find the container with id 127d2c079fbd21e4853737582a34391032b25f34b9809690c073c3a085dcc538 Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.414601 5008 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.414671 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data podName:3d5f0191-2702-46ed-ab82-e8c93ec1cf02 nodeName:}" failed. No retries permitted until 2026-03-18 18:27:22.414654935 +0000 UTC m=+1498.934128004 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data") pod "rabbitmq-server-0" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02") : configmap "rabbitmq-config-data" not found Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.415076 5008 secret.go:188] Couldn't get secret openstack/nova-cell1-novncproxy-config-data: secret "nova-cell1-novncproxy-config-data" not found Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.415116 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-config-data podName:7ab5f625-144a-4c7c-bab8-5399de3b5a8e nodeName:}" failed. No retries permitted until 2026-03-18 18:27:21.915103887 +0000 UTC m=+1498.434576966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-config-data") pod "nova-cell1-novncproxy-0" (UID: "7ab5f625-144a-4c7c-bab8-5399de3b5a8e") : secret "nova-cell1-novncproxy-config-data" not found Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.468189 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-835c-account-create-update-6znqk"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.502142 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-79e6-account-create-update-jqqxt"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.504767 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4edb3df2-7960-412a-ba0f-32bd8fdabc86/ovsdbserver-sb/0.log" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.504839 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.505243 5008 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 18 18:27:21 crc kubenswrapper[5008]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 18 18:27:21 crc kubenswrapper[5008]: + source /usr/local/bin/container-scripts/functions Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNBridge=br-int Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNRemote=tcp:localhost:6642 Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNEncapType=geneve Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNAvailabilityZones= Mar 18 18:27:21 crc kubenswrapper[5008]: ++ EnableChassisAsGateway=true Mar 18 18:27:21 crc kubenswrapper[5008]: ++ PhysicalNetworks= Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNHostName= Mar 18 18:27:21 crc kubenswrapper[5008]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 18 18:27:21 crc kubenswrapper[5008]: ++ ovs_dir=/var/lib/openvswitch Mar 18 18:27:21 crc kubenswrapper[5008]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 18 18:27:21 crc kubenswrapper[5008]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 18 18:27:21 crc kubenswrapper[5008]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 18:27:21 crc kubenswrapper[5008]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:27:21 crc kubenswrapper[5008]: + sleep 0.5 Mar 18 18:27:21 crc kubenswrapper[5008]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:27:21 crc kubenswrapper[5008]: + cleanup_ovsdb_server_semaphore Mar 18 18:27:21 crc kubenswrapper[5008]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 18:27:21 crc kubenswrapper[5008]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 18 18:27:21 crc kubenswrapper[5008]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-x8pkm" message=< Mar 18 18:27:21 crc kubenswrapper[5008]: Exiting ovsdb-server (5) [ OK ] Mar 18 18:27:21 crc kubenswrapper[5008]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 18 18:27:21 crc kubenswrapper[5008]: + source /usr/local/bin/container-scripts/functions Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNBridge=br-int Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNRemote=tcp:localhost:6642 Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNEncapType=geneve Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNAvailabilityZones= Mar 18 18:27:21 crc kubenswrapper[5008]: ++ EnableChassisAsGateway=true Mar 18 18:27:21 crc kubenswrapper[5008]: ++ PhysicalNetworks= Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNHostName= Mar 18 18:27:21 crc kubenswrapper[5008]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 18 18:27:21 crc kubenswrapper[5008]: ++ ovs_dir=/var/lib/openvswitch Mar 18 18:27:21 crc kubenswrapper[5008]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 18 18:27:21 crc kubenswrapper[5008]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 18 18:27:21 crc kubenswrapper[5008]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 18:27:21 crc kubenswrapper[5008]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:27:21 crc kubenswrapper[5008]: + sleep 0.5 Mar 18 18:27:21 crc kubenswrapper[5008]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:27:21 crc kubenswrapper[5008]: + cleanup_ovsdb_server_semaphore Mar 18 18:27:21 crc kubenswrapper[5008]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 18:27:21 crc kubenswrapper[5008]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 18 18:27:21 crc kubenswrapper[5008]: > Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.505272 5008 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 18 18:27:21 crc kubenswrapper[5008]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 18 18:27:21 crc kubenswrapper[5008]: + source /usr/local/bin/container-scripts/functions Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNBridge=br-int Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNRemote=tcp:localhost:6642 Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNEncapType=geneve Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNAvailabilityZones= Mar 18 18:27:21 crc kubenswrapper[5008]: ++ EnableChassisAsGateway=true Mar 18 18:27:21 crc kubenswrapper[5008]: ++ PhysicalNetworks= Mar 18 18:27:21 crc kubenswrapper[5008]: ++ OVNHostName= Mar 18 18:27:21 crc kubenswrapper[5008]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 18 18:27:21 crc kubenswrapper[5008]: ++ ovs_dir=/var/lib/openvswitch Mar 18 18:27:21 crc kubenswrapper[5008]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 18 18:27:21 crc kubenswrapper[5008]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 18 18:27:21 crc kubenswrapper[5008]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 18:27:21 crc kubenswrapper[5008]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:27:21 crc kubenswrapper[5008]: + sleep 0.5 Mar 18 18:27:21 crc kubenswrapper[5008]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 18 18:27:21 crc kubenswrapper[5008]: + cleanup_ovsdb_server_semaphore Mar 18 18:27:21 crc kubenswrapper[5008]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 18 18:27:21 crc kubenswrapper[5008]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 18 18:27:21 crc kubenswrapper[5008]: > pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovsdb-server" containerID="cri-o://240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.505302 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovsdb-server" containerID="cri-o://240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.545024 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:27:21 crc kubenswrapper[5008]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: if [ -n "neutron" ]; then Mar 18 18:27:21 crc kubenswrapper[5008]: GRANT_DATABASE="neutron" Mar 18 18:27:21 crc kubenswrapper[5008]: else Mar 18 18:27:21 
crc kubenswrapper[5008]: GRANT_DATABASE="*" Mar 18 18:27:21 crc kubenswrapper[5008]: fi Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: # going for maximum compatibility here: Mar 18 18:27:21 crc kubenswrapper[5008]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:27:21 crc kubenswrapper[5008]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:27:21 crc kubenswrapper[5008]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:27:21 crc kubenswrapper[5008]: # support updates Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.547568 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-9abb-account-create-update-z2jlj" podUID="efb8b8df-bedb-4e35-b709-25e83be00470" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.547964 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bltdw\" (UniqueName: \"kubernetes.io/projected/4edb3df2-7960-412a-ba0f-32bd8fdabc86-kube-api-access-bltdw\") pod \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.551159 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-combined-ca-bundle\") pod \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.551574 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-ovsdbserver-sb-tls-certs\") pod \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.551756 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4edb3df2-7960-412a-ba0f-32bd8fdabc86-ovsdb-rundir\") pod \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.551917 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.552040 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4edb3df2-7960-412a-ba0f-32bd8fdabc86-scripts\") pod \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.552185 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-metrics-certs-tls-certs\") pod \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.552253 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4edb3df2-7960-412a-ba0f-32bd8fdabc86-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "4edb3df2-7960-412a-ba0f-32bd8fdabc86" (UID: "4edb3df2-7960-412a-ba0f-32bd8fdabc86"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.552676 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4edb3df2-7960-412a-ba0f-32bd8fdabc86-scripts" (OuterVolumeSpecName: "scripts") pod "4edb3df2-7960-412a-ba0f-32bd8fdabc86" (UID: "4edb3df2-7960-412a-ba0f-32bd8fdabc86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.554669 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:27:21 crc kubenswrapper[5008]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: if [ -n "" ]; then Mar 18 18:27:21 crc kubenswrapper[5008]: GRANT_DATABASE="" Mar 18 18:27:21 crc kubenswrapper[5008]: else Mar 18 18:27:21 crc kubenswrapper[5008]: GRANT_DATABASE="*" Mar 18 18:27:21 crc kubenswrapper[5008]: fi Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: # going for maximum compatibility here: Mar 18 18:27:21 crc kubenswrapper[5008]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:27:21 crc kubenswrapper[5008]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:27:21 crc kubenswrapper[5008]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:27:21 crc kubenswrapper[5008]: # support updates Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.556030 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-j5dtq" podUID="8724ccad-851e-4efc-ad3c-d34252a3f29f" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.559862 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-f7btr"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.575229 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-85f9f77dc-mg4p7"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.575499 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-85f9f77dc-mg4p7" podUID="0c9299b1-8e15-4e9c-bada-ce88af9c1c28" containerName="neutron-api" containerID="cri-o://37ee661f7953b8d9a32a6d1f71d0668b96eefbaeef42d315f3b9741a68892653" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.575972 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-85f9f77dc-mg4p7" podUID="0c9299b1-8e15-4e9c-bada-ce88af9c1c28" containerName="neutron-httpd" containerID="cri-o://c22d63804d2fa8eaa1661c6774af139f50fcab700ab10feba4318bb64f3859aa" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.577754 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4edb3df2-7960-412a-ba0f-32bd8fdabc86-kube-api-access-bltdw" (OuterVolumeSpecName: "kube-api-access-bltdw") pod "4edb3df2-7960-412a-ba0f-32bd8fdabc86" (UID: 
"4edb3df2-7960-412a-ba0f-32bd8fdabc86"). InnerVolumeSpecName "kube-api-access-bltdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.599443 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "4edb3df2-7960-412a-ba0f-32bd8fdabc86" (UID: "4edb3df2-7960-412a-ba0f-32bd8fdabc86"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.619720 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-f7btr"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.638616 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-klfwj"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.648638 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.649797 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovs-vswitchd" containerID="cri-o://90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.649924 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-klfwj"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.654073 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1e0cfb4d-8438-45bc-882c-27c9544b40a5-openstack-config-secret\") pod \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.654106 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1e0cfb4d-8438-45bc-882c-27c9544b40a5-openstack-config\") pod \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.654197 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0cfb4d-8438-45bc-882c-27c9544b40a5-combined-ca-bundle\") pod \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.654266 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edb3df2-7960-412a-ba0f-32bd8fdabc86-config\") pod \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\" (UID: \"4edb3df2-7960-412a-ba0f-32bd8fdabc86\") " 
Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.654325 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgtl8\" (UniqueName: \"kubernetes.io/projected/1e0cfb4d-8438-45bc-882c-27c9544b40a5-kube-api-access-xgtl8\") pod \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\" (UID: \"1e0cfb4d-8438-45bc-882c-27c9544b40a5\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.654784 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bltdw\" (UniqueName: \"kubernetes.io/projected/4edb3df2-7960-412a-ba0f-32bd8fdabc86-kube-api-access-bltdw\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.654799 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4edb3df2-7960-412a-ba0f-32bd8fdabc86-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.654817 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.654826 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4edb3df2-7960-412a-ba0f-32bd8fdabc86-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.655113 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4edb3df2-7960-412a-ba0f-32bd8fdabc86-config" (OuterVolumeSpecName: "config") pod "4edb3df2-7960-412a-ba0f-32bd8fdabc86" (UID: "4edb3df2-7960-412a-ba0f-32bd8fdabc86"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.656159 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-78xsw_a8857503-cb26-46f0-b4a3-e931a9e3f1ed/openstack-network-exporter/0.log" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.656263 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.658979 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-34da-account-create-update-m48lt"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.668812 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-34da-account-create-update-m48lt"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.671403 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e0cfb4d-8438-45bc-882c-27c9544b40a5-kube-api-access-xgtl8" (OuterVolumeSpecName: "kube-api-access-xgtl8") pod "1e0cfb4d-8438-45bc-882c-27c9544b40a5" (UID: "1e0cfb4d-8438-45bc-882c-27c9544b40a5"). InnerVolumeSpecName "kube-api-access-xgtl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.673056 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.677463 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.677781 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" containerName="cinder-scheduler" containerID="cri-o://20f1fb66e65134755453d78cfd50a9bd3959e1e313d6a2e4fcb63644bb833236" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.677889 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" containerName="probe" containerID="cri-o://2f33719f92055074851b51599de618f808c5a4383b6fb86e503d5b01cf723941" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.685452 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-lllsm"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.685987 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e0cfb4d-8438-45bc-882c-27c9544b40a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e0cfb4d-8438-45bc-882c-27c9544b40a5" (UID: "1e0cfb4d-8438-45bc-882c-27c9544b40a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.692622 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-lllsm"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.703025 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e0cfb4d-8438-45bc-882c-27c9544b40a5-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1e0cfb4d-8438-45bc-882c-27c9544b40a5" (UID: "1e0cfb4d-8438-45bc-882c-27c9544b40a5"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.707730 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9abb-account-create-update-z2jlj"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.717213 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.717502 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d27fb392-40df-45a9-aeae-20781d90f02b" containerName="cinder-api-log" containerID="cri-o://f41069b2bd17363651059e9da6c0c9d0f44b1b9587760c94ac473bda5a5dc953" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.718940 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d27fb392-40df-45a9-aeae-20781d90f02b" containerName="cinder-api" containerID="cri-o://890384dd81bae30a3f9518e1b6415cdef1c842f2f56865d0c4e3551f7664eff2" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.725106 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-wrb54"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.736714 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4edb3df2-7960-412a-ba0f-32bd8fdabc86" (UID: "4edb3df2-7960-412a-ba0f-32bd8fdabc86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.748207 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-wrb54"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.755809 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfhkp\" (UniqueName: \"kubernetes.io/projected/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-kube-api-access-sfhkp\") pod \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.756045 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-ovs-rundir\") pod \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.756101 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-metrics-certs-tls-certs\") pod \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.756126 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-combined-ca-bundle\") pod \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.756182 5008 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-ovn-rundir\") pod \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.756227 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-config\") pod \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\" (UID: \"a8857503-cb26-46f0-b4a3-e931a9e3f1ed\") " Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.756453 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "a8857503-cb26-46f0-b4a3-e931a9e3f1ed" (UID: "a8857503-cb26-46f0-b4a3-e931a9e3f1ed"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.757035 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0cfb4d-8438-45bc-882c-27c9544b40a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.757063 5008 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.757073 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edb3df2-7960-412a-ba0f-32bd8fdabc86-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.757083 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgtl8\" (UniqueName: 
\"kubernetes.io/projected/1e0cfb4d-8438-45bc-882c-27c9544b40a5-kube-api-access-xgtl8\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.757093 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.757103 5008 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1e0cfb4d-8438-45bc-882c-27c9544b40a5-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.757111 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.757455 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "a8857503-cb26-46f0-b4a3-e931a9e3f1ed" (UID: "a8857503-cb26-46f0-b4a3-e931a9e3f1ed"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.757455 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-config" (OuterVolumeSpecName: "config") pod "a8857503-cb26-46f0-b4a3-e931a9e3f1ed" (UID: "a8857503-cb26-46f0-b4a3-e931a9e3f1ed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.760744 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-79e6-account-create-update-jqqxt"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.769269 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e0cfb4d-8438-45bc-882c-27c9544b40a5-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1e0cfb4d-8438-45bc-882c-27c9544b40a5" (UID: "1e0cfb4d-8438-45bc-882c-27c9544b40a5"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.769579 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-968pz"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.777241 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-968pz"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.783129 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3502-account-create-update-qzqhp"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.790327 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-kube-api-access-sfhkp" (OuterVolumeSpecName: "kube-api-access-sfhkp") pod "a8857503-cb26-46f0-b4a3-e931a9e3f1ed" (UID: "a8857503-cb26-46f0-b4a3-e931a9e3f1ed"). InnerVolumeSpecName "kube-api-access-sfhkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.790491 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xtddx"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.797762 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xtddx"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.811915 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.819916 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-efa2-account-create-update-pwnw8"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.829502 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.829873 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="96efea0e-17ae-49c4-8f5c-b7341def6878" containerName="nova-metadata-log" containerID="cri-o://ee06201918dbd5795e039c28701fa7c8e3bc7e2199124f057fd49d0f0f12daa4" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.830042 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="96efea0e-17ae-49c4-8f5c-b7341def6878" containerName="nova-metadata-metadata" containerID="cri-o://2662973c09a191ca3f45d2dc2e683ec4746e264a085d56eb97c4eff582206931" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.834119 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "4edb3df2-7960-412a-ba0f-32bd8fdabc86" (UID: "4edb3df2-7960-412a-ba0f-32bd8fdabc86"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.846715 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-jqltp"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.861115 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.861143 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfhkp\" (UniqueName: \"kubernetes.io/projected/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-kube-api-access-sfhkp\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.861155 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.861164 5008 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1e0cfb4d-8438-45bc-882c-27c9544b40a5-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.861173 5008 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.868841 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-jqltp"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.896333 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7cc79d78dc-6z4kh"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.896589 5008 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" podUID="e3448bfb-b04e-4f59-b275-45ae07178640" containerName="proxy-httpd" containerID="cri-o://868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.897088 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" podUID="e3448bfb-b04e-4f59-b275-45ae07178640" containerName="proxy-server" containerID="cri-o://e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781" gracePeriod=30 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.898158 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "4edb3df2-7960-412a-ba0f-32bd8fdabc86" (UID: "4edb3df2-7960-412a-ba0f-32bd8fdabc86"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.911702 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8857503-cb26-46f0-b4a3-e931a9e3f1ed" (UID: "a8857503-cb26-46f0-b4a3-e931a9e3f1ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.913774 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2f63-account-create-update-z57k6"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.944566 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9abb-account-create-update-z2jlj"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.951301 5008 generic.go:334] "Generic (PLEG): container finished" podID="68b393c9-78fb-4bde-930d-6af4b840f9e3" containerID="22db4948be656e49590d67a4b03650ce23c340a3e654f974e8ccf95e3e514659" exitCode=0 Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.951486 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" event={"ID":"68b393c9-78fb-4bde-930d-6af4b840f9e3","Type":"ContainerDied","Data":"22db4948be656e49590d67a4b03650ce23c340a3e654f974e8ccf95e3e514659"} Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.954233 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j5dtq" event={"ID":"8724ccad-851e-4efc-ad3c-d34252a3f29f","Type":"ContainerStarted","Data":"f05d286e2f3388afd6a09ef5abb6b12bf7d24c7c48c4d7b66890cc0933acae27"} Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.955415 5008 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-j5dtq" secret="" err="secret \"galera-openstack-cell1-dockercfg-dkzfs\" not found" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.964112 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb3df2-7960-412a-ba0f-32bd8fdabc86-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.964141 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.964991 5008 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.965113 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts podName:8724ccad-851e-4efc-ad3c-d34252a3f29f nodeName:}" failed. No retries permitted until 2026-03-18 18:27:22.465095031 +0000 UTC m=+1498.984568100 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts") pod "root-account-create-update-j5dtq" (UID: "8724ccad-851e-4efc-ad3c-d34252a3f29f") : configmap "openstack-cell1-scripts" not found Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.965259 5008 secret.go:188] Couldn't get secret openstack/nova-cell1-novncproxy-config-data: secret "nova-cell1-novncproxy-config-data" not found Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.965344 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-config-data podName:7ab5f625-144a-4c7c-bab8-5399de3b5a8e nodeName:}" failed. No retries permitted until 2026-03-18 18:27:22.965332877 +0000 UTC m=+1499.484805956 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-config-data") pod "nova-cell1-novncproxy-0" (UID: "7ab5f625-144a-4c7c-bab8-5399de3b5a8e") : secret "nova-cell1-novncproxy-config-data" not found Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.967067 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-kgmwv"] Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.974009 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:27:21 crc kubenswrapper[5008]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword 
variable."} Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: if [ -n "" ]; then Mar 18 18:27:21 crc kubenswrapper[5008]: GRANT_DATABASE="" Mar 18 18:27:21 crc kubenswrapper[5008]: else Mar 18 18:27:21 crc kubenswrapper[5008]: GRANT_DATABASE="*" Mar 18 18:27:21 crc kubenswrapper[5008]: fi Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: # going for maximum compatibility here: Mar 18 18:27:21 crc kubenswrapper[5008]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:27:21 crc kubenswrapper[5008]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:27:21 crc kubenswrapper[5008]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:27:21 crc kubenswrapper[5008]: # support updates Mar 18 18:27:21 crc kubenswrapper[5008]: Mar 18 18:27:21 crc kubenswrapper[5008]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:27:21 crc kubenswrapper[5008]: E0318 18:27:21.976886 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-j5dtq" podUID="8724ccad-851e-4efc-ad3c-d34252a3f29f" Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.982999 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-kgmwv"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.991188 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:27:21 crc kubenswrapper[5008]: I0318 18:27:21.994882 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-metrics-certs-tls-certs" 
(OuterVolumeSpecName: "metrics-certs-tls-certs") pod "a8857503-cb26-46f0-b4a3-e931a9e3f1ed" (UID: "a8857503-cb26-46f0-b4a3-e931a9e3f1ed"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.007000 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j5dtq"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.015551 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.015861 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"380cc5591873123d91f18022fd060b0a9e10c5e3b072ae816f61a2e6ad015a78"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.015717 5008 generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="380cc5591873123d91f18022fd060b0a9e10c5e3b072ae816f61a2e6ad015a78" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.016152 5008 generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="ee0fd9858e770e37fee73845e8c0a241a341746edd5488d756144d0dbce6ee7b" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.016231 5008 generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="e00afaaa564c37f366ee4ac26eb8ca94d2c1e8b26ed42d7509ff378a29f8f96a" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.016290 5008 generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="4f571497d968bf39f24266de4994c7de6a2c821baa3ad302407cf536047c662e" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.016341 5008 generic.go:334] "Generic (PLEG): container finished" 
podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="3b7732cd3cbc9f6e46f3b52e181285bcbcb64ff5a7d634bf4399f0d57729ef65" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.016396 5008 generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="0b1ee5c8c45f6646ada20310701b8ec3f99b2a8128a2190acf71a6ef29f4200a" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.018414 5008 generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="2a06525b664dc560a781b00430903d7869796e656f728e7637b34cc39532a99e" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.018495 5008 generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="aa11f18a13f730403dae487c0f3224a3b6d6266ab6e0fc1aab36fa0cff77ecb4" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.018577 5008 generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="21b72f72c110b5cddd921bb4d2588f810988fa4c91525dd72e97c92a5f5d881d" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.018646 5008 generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="a3d478398fbcc00ebce85e7d90128952489a28ad02808dcc006fc3822c4fdaba" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.018709 5008 generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="b5255bfa8eb99b8162ba17c46557ccb30518ff7df2b8694d473240b663a9ce8c" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.018782 5008 generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="688589b78817d925eba18cc083d7aae7884af996d5eac87b2f9b8be694e1d743" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.018847 5008 
generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="b75695d9f9722a67c19ee08c21555a403a91fc1e836d2a6b7c94c581c39bc7e8" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.018910 5008 generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="b2ff7dec8963820747dd167a23cc98ef08104fcb6886f42e0c188e8c2d2b5557" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.017148 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bda3600a-d612-43ec-8b45-77eccc420b0f" containerName="nova-api-api" containerID="cri-o://4464660f5642808fbb37e8da2bb43669e2159a066dcb94276d716910e10ff0cd" gracePeriod=30 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.016770 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"ee0fd9858e770e37fee73845e8c0a241a341746edd5488d756144d0dbce6ee7b"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.019248 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"e00afaaa564c37f366ee4ac26eb8ca94d2c1e8b26ed42d7509ff378a29f8f96a"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.019317 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"4f571497d968bf39f24266de4994c7de6a2c821baa3ad302407cf536047c662e"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.019375 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"3b7732cd3cbc9f6e46f3b52e181285bcbcb64ff5a7d634bf4399f0d57729ef65"} Mar 18 18:27:22 
crc kubenswrapper[5008]: I0318 18:27:22.019435 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"0b1ee5c8c45f6646ada20310701b8ec3f99b2a8128a2190acf71a6ef29f4200a"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.019505 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"2a06525b664dc560a781b00430903d7869796e656f728e7637b34cc39532a99e"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.019596 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"aa11f18a13f730403dae487c0f3224a3b6d6266ab6e0fc1aab36fa0cff77ecb4"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.019933 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"21b72f72c110b5cddd921bb4d2588f810988fa4c91525dd72e97c92a5f5d881d"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.020024 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"a3d478398fbcc00ebce85e7d90128952489a28ad02808dcc006fc3822c4fdaba"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.020105 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"b5255bfa8eb99b8162ba17c46557ccb30518ff7df2b8694d473240b663a9ce8c"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.020179 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"688589b78817d925eba18cc083d7aae7884af996d5eac87b2f9b8be694e1d743"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.020253 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"b75695d9f9722a67c19ee08c21555a403a91fc1e836d2a6b7c94c581c39bc7e8"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.020330 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"b2ff7dec8963820747dd167a23cc98ef08104fcb6886f42e0c188e8c2d2b5557"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.016952 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bda3600a-d612-43ec-8b45-77eccc420b0f" containerName="nova-api-log" containerID="cri-o://b41987f381e2bd0e36d9ddaace1b57cfba485aef900b80078b6110928f0b7ca2" gracePeriod=30 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.025614 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.026647 5008 generic.go:334] "Generic (PLEG): container finished" podID="d27fb392-40df-45a9-aeae-20781d90f02b" containerID="f41069b2bd17363651059e9da6c0c9d0f44b1b9587760c94ac473bda5a5dc953" exitCode=143 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.026785 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d27fb392-40df-45a9-aeae-20781d90f02b","Type":"ContainerDied","Data":"f41069b2bd17363651059e9da6c0c9d0f44b1b9587760c94ac473bda5a5dc953"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.030175 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-77598b888d-8wwqt"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.030479 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-77598b888d-8wwqt" podUID="8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" containerName="barbican-api-log" containerID="cri-o://e741fbe7654b140e43fc28460cedf85595febc32bb65a05c9b3ffb75588298c2" gracePeriod=30 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.030807 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-77598b888d-8wwqt" podUID="8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" containerName="barbican-api" containerID="cri-o://4eadc2692f3a8d44b36203cb5d3923e2c3a539a3f5ba5c413d2a6377e7749375" gracePeriod=30 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.037658 5008 generic.go:334] "Generic (PLEG): container finished" podID="1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" containerID="41318c69eaf0ebfd748306eccddaeed6ec2d4a9388b3be00fa57b1b52a5ad7b8" exitCode=143 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.037795 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b86568468-vhc29" 
event={"ID":"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b","Type":"ContainerDied","Data":"41318c69eaf0ebfd748306eccddaeed6ec2d4a9388b3be00fa57b1b52a5ad7b8"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.052594 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-ff487fff5-mqmcg"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.057749 5008 generic.go:334] "Generic (PLEG): container finished" podID="0c9299b1-8e15-4e9c-bada-ce88af9c1c28" containerID="c22d63804d2fa8eaa1661c6774af139f50fcab700ab10feba4318bb64f3859aa" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.057838 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f9f77dc-mg4p7" event={"ID":"0c9299b1-8e15-4e9c-bada-ce88af9c1c28","Type":"ContainerDied","Data":"c22d63804d2fa8eaa1661c6774af139f50fcab700ab10feba4318bb64f3859aa"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.059107 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-ff487fff5-mqmcg" podUID="d67f3431-0e44-4d3c-8aa9-0f3fb176387d" containerName="barbican-worker" containerID="cri-o://7f6d7b25bd945a15d5845ec99b4e2fba05147a61e2fd4a6f4ddb1df33c32d951" gracePeriod=30 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.059066 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-ff487fff5-mqmcg" podUID="d67f3431-0e44-4d3c-8aa9-0f3fb176387d" containerName="barbican-worker-log" containerID="cri-o://f7c289c9bf02046bec3de029584d233f16cf2686d6ecf191d260ee9b9b82fc59" gracePeriod=30 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.066367 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-config\") pod \"68b393c9-78fb-4bde-930d-6af4b840f9e3\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 
18:27:22.066462 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-dns-swift-storage-0\") pod \"68b393c9-78fb-4bde-930d-6af4b840f9e3\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.066500 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-ovsdbserver-sb\") pod \"68b393c9-78fb-4bde-930d-6af4b840f9e3\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.066580 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-ovsdbserver-nb\") pod \"68b393c9-78fb-4bde-930d-6af4b840f9e3\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.066605 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrc8c\" (UniqueName: \"kubernetes.io/projected/68b393c9-78fb-4bde-930d-6af4b840f9e3-kube-api-access-rrc8c\") pod \"68b393c9-78fb-4bde-930d-6af4b840f9e3\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.066914 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-dns-svc\") pod \"68b393c9-78fb-4bde-930d-6af4b840f9e3\" (UID: \"68b393c9-78fb-4bde-930d-6af4b840f9e3\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.067298 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8857503-cb26-46f0-b4a3-e931a9e3f1ed-metrics-certs-tls-certs\") on node \"crc\" 
DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.097791 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b393c9-78fb-4bde-930d-6af4b840f9e3-kube-api-access-rrc8c" (OuterVolumeSpecName: "kube-api-access-rrc8c") pod "68b393c9-78fb-4bde-930d-6af4b840f9e3" (UID: "68b393c9-78fb-4bde-930d-6af4b840f9e3"). InnerVolumeSpecName "kube-api-access-rrc8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.100473 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:27:22 crc kubenswrapper[5008]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: if [ -n "nova_cell0" ]; then Mar 18 18:27:22 crc kubenswrapper[5008]: GRANT_DATABASE="nova_cell0" Mar 18 18:27:22 crc kubenswrapper[5008]: else Mar 18 18:27:22 crc kubenswrapper[5008]: GRANT_DATABASE="*" Mar 18 18:27:22 crc kubenswrapper[5008]: fi Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: # going for maximum compatibility here: Mar 18 18:27:22 crc kubenswrapper[5008]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:27:22 crc kubenswrapper[5008]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:27:22 crc kubenswrapper[5008]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:27:22 crc kubenswrapper[5008]: # support updates Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.105826 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-2f63-account-create-update-z57k6" podUID="a5b1de51-7913-41fc-afd9-b1f901532d03" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.106420 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b60d757b-db66-46c1-ad92-4a9e591217a0" containerName="rabbitmq" containerID="cri-o://82ae68d17e245549f6e1d17dc9fc655108ab134769fb5a66e64e73b24d153411" gracePeriod=604800 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.126546 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_defaf26d-efb3-4ab4-96fb-fe8826988fe1/ovsdbserver-nb/0.log" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.126637 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.146568 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4edb3df2-7960-412a-ba0f-32bd8fdabc86/ovsdbserver-sb/0.log" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.152615 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.153699 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ae04-account-create-update-z85w5"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.153776 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4edb3df2-7960-412a-ba0f-32bd8fdabc86","Type":"ContainerDied","Data":"14f529541100816645fc2e14c39aac9db90e837516c6820a8088594e813b6578"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.153815 5008 scope.go:117] "RemoveContainer" containerID="7e44b2e0f1ce0f0062f29542f63b0c0364c6a86812c315179e49f26887d11a6d" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.162035 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-66b589877b-qzcdx"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.162302 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" podUID="24a03e07-237e-4583-81b4-8d9aadc76ea3" containerName="barbican-keystone-listener-log" containerID="cri-o://b33ab61dcd597370160ead5da0be76a79fadc5911c77ebcd182c96bc9147d686" gracePeriod=30 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.162432 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" podUID="24a03e07-237e-4583-81b4-8d9aadc76ea3" containerName="barbican-keystone-listener" containerID="cri-o://dbb3bbf45e4cb157c1f61921058ede9501e02fa9f0a5deea7ea5db2b1a9c31e5" gracePeriod=30 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.165124 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-78xsw_a8857503-cb26-46f0-b4a3-e931a9e3f1ed/openstack-network-exporter/0.log" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.165160 5008 generic.go:334] 
"Generic (PLEG): container finished" podID="a8857503-cb26-46f0-b4a3-e931a9e3f1ed" containerID="82263332669d05ad9b24a3bc4c19472247748eb643154b4c9d4bda10744e11ae" exitCode=2 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.165207 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-78xsw" event={"ID":"a8857503-cb26-46f0-b4a3-e931a9e3f1ed","Type":"ContainerDied","Data":"82263332669d05ad9b24a3bc4c19472247748eb643154b4c9d4bda10744e11ae"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.165224 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-78xsw" event={"ID":"a8857503-cb26-46f0-b4a3-e931a9e3f1ed","Type":"ContainerDied","Data":"c457a9b60742c8038f2b523b9bba086cb3dcaa191c37cf8408df50194e8210c2"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.166604 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-78xsw" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.168377 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-metrics-certs-tls-certs\") pod \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.168442 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/defaf26d-efb3-4ab4-96fb-fe8826988fe1-scripts\") pod \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.168466 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/defaf26d-efb3-4ab4-96fb-fe8826988fe1-ovsdb-rundir\") pod \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\" 
(UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.168515 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-ovsdbserver-nb-tls-certs\") pod \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.168538 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defaf26d-efb3-4ab4-96fb-fe8826988fe1-config\") pod \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.168680 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.168723 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-combined-ca-bundle\") pod \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.168744 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvcqt\" (UniqueName: \"kubernetes.io/projected/defaf26d-efb3-4ab4-96fb-fe8826988fe1-kube-api-access-hvcqt\") pod \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\" (UID: \"defaf26d-efb3-4ab4-96fb-fe8826988fe1\") " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.169203 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrc8c\" (UniqueName: 
\"kubernetes.io/projected/68b393c9-78fb-4bde-930d-6af4b840f9e3-kube-api-access-rrc8c\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.176197 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/defaf26d-efb3-4ab4-96fb-fe8826988fe1-scripts" (OuterVolumeSpecName: "scripts") pod "defaf26d-efb3-4ab4-96fb-fe8826988fe1" (UID: "defaf26d-efb3-4ab4-96fb-fe8826988fe1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.177837 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/defaf26d-efb3-4ab4-96fb-fe8826988fe1-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "defaf26d-efb3-4ab4-96fb-fe8826988fe1" (UID: "defaf26d-efb3-4ab4-96fb-fe8826988fe1"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.179060 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/defaf26d-efb3-4ab4-96fb-fe8826988fe1-config" (OuterVolumeSpecName: "config") pod "defaf26d-efb3-4ab4-96fb-fe8826988fe1" (UID: "defaf26d-efb3-4ab4-96fb-fe8826988fe1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.186841 5008 generic.go:334] "Generic (PLEG): container finished" podID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" exitCode=0 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.186981 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-x8pkm" event={"ID":"f55031bd-9626-475f-a74f-d0e5f8ec8a66","Type":"ContainerDied","Data":"240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.187693 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/defaf26d-efb3-4ab4-96fb-fe8826988fe1-kube-api-access-hvcqt" (OuterVolumeSpecName: "kube-api-access-hvcqt") pod "defaf26d-efb3-4ab4-96fb-fe8826988fe1" (UID: "defaf26d-efb3-4ab4-96fb-fe8826988fe1"). InnerVolumeSpecName "kube-api-access-hvcqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.191756 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:27:22 crc kubenswrapper[5008]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: if [ -n "nova_api" ]; then Mar 18 18:27:22 crc kubenswrapper[5008]: GRANT_DATABASE="nova_api" Mar 18 18:27:22 crc kubenswrapper[5008]: else Mar 18 18:27:22 crc kubenswrapper[5008]: GRANT_DATABASE="*" Mar 18 18:27:22 crc kubenswrapper[5008]: fi Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: # going for maximum compatibility here: Mar 18 18:27:22 crc kubenswrapper[5008]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:27:22 crc kubenswrapper[5008]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:27:22 crc kubenswrapper[5008]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:27:22 crc kubenswrapper[5008]: # support updates Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.192113 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "defaf26d-efb3-4ab4-96fb-fe8826988fe1" (UID: "defaf26d-efb3-4ab4-96fb-fe8826988fe1"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.193718 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-efa2-account-create-update-pwnw8" podUID="5de40aba-2f95-4c75-8875-8ffbf5f17898" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.197078 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9abb-account-create-update-z2jlj" event={"ID":"efb8b8df-bedb-4e35-b709-25e83be00470","Type":"ContainerStarted","Data":"127d2c079fbd21e4853737582a34391032b25f34b9809690c073c3a085dcc538"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.209421 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="07bd6644-ca18-4b8d-ad83-9757257768fb" containerName="galera" containerID="cri-o://4cc6ebdf06122d5824ebe49392c76f20a08604dc1120f4380c5f58b19d7e4d01" gracePeriod=30 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.217071 5008 generic.go:334] "Generic (PLEG): container finished" podID="1e0cfb4d-8438-45bc-882c-27c9544b40a5" containerID="3a98cf2e27abc88c220c6aeb0751a297dd22a0499c8e9a35f107d7f5b969d959" exitCode=137 Mar 18 
18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.217449 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.259701 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "defaf26d-efb3-4ab4-96fb-fe8826988fe1" (UID: "defaf26d-efb3-4ab4-96fb-fe8826988fe1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.267505 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "68b393c9-78fb-4bde-930d-6af4b840f9e3" (UID: "68b393c9-78fb-4bde-930d-6af4b840f9e3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.273832 5008 scope.go:117] "RemoveContainer" containerID="7105fd9adfc4911e01e2a18a48dd35e4e9e7daabc38c06b0e726445c47171d4a" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.298333 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:27:22 crc kubenswrapper[5008]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: if [ -n "barbican" ]; then Mar 18 18:27:22 crc kubenswrapper[5008]: GRANT_DATABASE="barbican" Mar 18 18:27:22 crc kubenswrapper[5008]: else Mar 18 18:27:22 crc kubenswrapper[5008]: GRANT_DATABASE="*" Mar 18 18:27:22 crc kubenswrapper[5008]: fi Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: # going for maximum compatibility here: Mar 18 18:27:22 crc kubenswrapper[5008]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:27:22 crc kubenswrapper[5008]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:27:22 crc kubenswrapper[5008]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:27:22 crc kubenswrapper[5008]: # support updates Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.299643 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_defaf26d-efb3-4ab4-96fb-fe8826988fe1/ovsdbserver-nb/0.log" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.301747 5008 generic.go:334] "Generic (PLEG): container finished" podID="defaf26d-efb3-4ab4-96fb-fe8826988fe1" containerID="5baba2548d7c511ef77d2873377d1b6c415a3d1316f182988c6113417f3cbbc0" exitCode=2 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.302052 5008 generic.go:334] "Generic (PLEG): container finished" podID="defaf26d-efb3-4ab4-96fb-fe8826988fe1" containerID="afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54" exitCode=143 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.304145 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.304498 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-79e6-account-create-update-jqqxt" podUID="1d951b25-e886-44c9-b7f7-d60853e1e0a9" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.314214 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:27:22 crc kubenswrapper[5008]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: if [ -n "cinder" ]; then Mar 18 18:27:22 crc kubenswrapper[5008]: GRANT_DATABASE="cinder" Mar 18 18:27:22 crc kubenswrapper[5008]: else Mar 18 18:27:22 crc kubenswrapper[5008]: GRANT_DATABASE="*" Mar 18 18:27:22 crc kubenswrapper[5008]: fi Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: # going for maximum compatibility here: Mar 18 18:27:22 crc kubenswrapper[5008]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:27:22 crc kubenswrapper[5008]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:27:22 crc kubenswrapper[5008]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:27:22 crc kubenswrapper[5008]: # support updates Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.315574 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-3502-account-create-update-qzqhp" podUID="8026d1a2-1e3a-4930-9424-56565551f4bb" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.318048 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "68b393c9-78fb-4bde-930d-6af4b840f9e3" (UID: "68b393c9-78fb-4bde-930d-6af4b840f9e3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.328038 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68b393c9-78fb-4bde-930d-6af4b840f9e3" (UID: "68b393c9-78fb-4bde-930d-6af4b840f9e3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.331017 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:27:22 crc kubenswrapper[5008]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: if [ -n "neutron" ]; then Mar 18 18:27:22 crc kubenswrapper[5008]: GRANT_DATABASE="neutron" Mar 18 18:27:22 crc kubenswrapper[5008]: else Mar 18 18:27:22 crc kubenswrapper[5008]: GRANT_DATABASE="*" Mar 18 18:27:22 crc kubenswrapper[5008]: fi Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: # going for maximum compatibility here: Mar 18 18:27:22 crc kubenswrapper[5008]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:27:22 crc kubenswrapper[5008]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:27:22 crc kubenswrapper[5008]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:27:22 crc kubenswrapper[5008]: # support updates Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.331082 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.331403 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.331429 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvcqt\" (UniqueName: \"kubernetes.io/projected/defaf26d-efb3-4ab4-96fb-fe8826988fe1-kube-api-access-hvcqt\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.331443 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/defaf26d-efb3-4ab4-96fb-fe8826988fe1-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.331452 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/defaf26d-efb3-4ab4-96fb-fe8826988fe1-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.331461 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defaf26d-efb3-4ab4-96fb-fe8826988fe1-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.331470 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.332850 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-9abb-account-create-update-z2jlj" podUID="efb8b8df-bedb-4e35-b709-25e83be00470" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.334251 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05c9a5fb-e4e1-4d69-b790-2f5890b62aa8" path="/var/lib/kubelet/pods/05c9a5fb-e4e1-4d69-b790-2f5890b62aa8/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.336630 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0649a6ec-1562-4617-9af7-0dafa2e201eb" path="/var/lib/kubelet/pods/0649a6ec-1562-4617-9af7-0dafa2e201eb/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.337473 5008 generic.go:334] "Generic (PLEG): container finished" podID="582dafe2-2020-4966-921d-cc5e9f0db46c" containerID="4efce2fc93ac7a338b3af0f031e3448a62c3f819288460beeb882dd6abd7cbe9" exitCode=143 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.337991 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1316c67d-d795-4340-88e4-918b6291d950" path="/var/lib/kubelet/pods/1316c67d-d795-4340-88e4-918b6291d950/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.340206 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e0cfb4d-8438-45bc-882c-27c9544b40a5" path="/var/lib/kubelet/pods/1e0cfb4d-8438-45bc-882c-27c9544b40a5/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.343085 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="315d5f4a-5139-47d4-8aaf-c3088d6eae91" path="/var/lib/kubelet/pods/315d5f4a-5139-47d4-8aaf-c3088d6eae91/volumes" 
Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.345245 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496ae433-798d-40a6-b049-d0f33f87b5b4" path="/var/lib/kubelet/pods/496ae433-798d-40a6-b049-d0f33f87b5b4/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.345642 5008 generic.go:334] "Generic (PLEG): container finished" podID="8679cebf-8eea-45ae-be70-26eea9396f8e" containerID="48672c4f4a417aeb7d70b46843dbaf3f5264f47434917232456154f0d644258b" exitCode=143 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.346933 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ad5b158-154c-4219-8a1d-d6df23e11d42" path="/var/lib/kubelet/pods/4ad5b158-154c-4219-8a1d-d6df23e11d42/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.347503 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c40e7e7-0a01-41e4-a1e0-b30f415be2d2" path="/var/lib/kubelet/pods/4c40e7e7-0a01-41e4-a1e0-b30f415be2d2/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.348403 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6030a43a-5893-4e6d-919a-c6c0ca0832a8" path="/var/lib/kubelet/pods/6030a43a-5893-4e6d-919a-c6c0ca0832a8/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.349082 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6daf86c5-b733-40a0-a1e7-0991e59f4b80" path="/var/lib/kubelet/pods/6daf86c5-b733-40a0-a1e7-0991e59f4b80/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.350980 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7299f042-11a3-4875-a5f8-59f18eb2df32" path="/var/lib/kubelet/pods/7299f042-11a3-4875-a5f8-59f18eb2df32/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.351610 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97b986d5-bdff-4a92-bec9-27511e91dd2b" 
path="/var/lib/kubelet/pods/97b986d5-bdff-4a92-bec9-27511e91dd2b/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.352815 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a03defc9-9b67-47f0-b87a-ed5345e84c18" path="/var/lib/kubelet/pods/a03defc9-9b67-47f0-b87a-ed5345e84c18/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.355187 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "68b393c9-78fb-4bde-930d-6af4b840f9e3" (UID: "68b393c9-78fb-4bde-930d-6af4b840f9e3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.356374 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a32dc881-7587-4467-a24e-80483bbd29c4" path="/var/lib/kubelet/pods/a32dc881-7587-4467-a24e-80483bbd29c4/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.357827 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b" path="/var/lib/kubelet/pods/acf55a4f-c9d5-4806-9aa9-dbe20a61cc0b/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.358449 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0593bf5-e080-4ed8-a376-f73ad47f5086" path="/var/lib/kubelet/pods/b0593bf5-e080-4ed8-a376-f73ad47f5086/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.359324 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba33af3d-82b3-406f-9b9d-9511c8e874c1" path="/var/lib/kubelet/pods/ba33af3d-82b3-406f-9b9d-9511c8e874c1/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.359527 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 
18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.360688 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bed54cd2-a411-4362-a7b1-7fab16ba8b6b" path="/var/lib/kubelet/pods/bed54cd2-a411-4362-a7b1-7fab16ba8b6b/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.361642 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf33fcec-6589-4b9b-8271-7f51af7ae085" path="/var/lib/kubelet/pods/bf33fcec-6589-4b9b-8271-7f51af7ae085/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.362235 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f" path="/var/lib/kubelet/pods/e6acbf82-1c09-4eb0-b175-5f8a8a5e8d1f/volumes" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.379304 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "defaf26d-efb3-4ab4-96fb-fe8826988fe1" (UID: "defaf26d-efb3-4ab4-96fb-fe8826988fe1"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.407825 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-config" (OuterVolumeSpecName: "config") pod "68b393c9-78fb-4bde-930d-6af4b840f9e3" (UID: "68b393c9-78fb-4bde-930d-6af4b840f9e3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.432703 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.432744 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.432755 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.432765 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.432777 5008 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.432787 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68b393c9-78fb-4bde-930d-6af4b840f9e3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.432854 5008 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.432905 5008 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data podName:3d5f0191-2702-46ed-ab82-e8c93ec1cf02 nodeName:}" failed. No retries permitted until 2026-03-18 18:27:24.432888885 +0000 UTC m=+1500.952361964 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data") pod "rabbitmq-server-0" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02") : configmap "rabbitmq-config-data" not found Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.534816 5008 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.534868 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts podName:8724ccad-851e-4efc-ad3c-d34252a3f29f nodeName:}" failed. No retries permitted until 2026-03-18 18:27:23.534855879 +0000 UTC m=+1500.054328958 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts") pod "root-account-create-update-j5dtq" (UID: "8724ccad-851e-4efc-ad3c-d34252a3f29f") : configmap "openstack-cell1-scripts" not found Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.535875 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:27:22 crc kubenswrapper[5008]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: if [ -n "nova_cell1" ]; then Mar 18 18:27:22 crc kubenswrapper[5008]: GRANT_DATABASE="nova_cell1" Mar 18 18:27:22 crc kubenswrapper[5008]: else Mar 18 18:27:22 crc kubenswrapper[5008]: GRANT_DATABASE="*" Mar 18 18:27:22 crc kubenswrapper[5008]: fi Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: # going for maximum compatibility here: Mar 18 18:27:22 crc kubenswrapper[5008]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:27:22 crc kubenswrapper[5008]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:27:22 crc kubenswrapper[5008]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:27:22 crc kubenswrapper[5008]: # support updates Mar 18 18:27:22 crc kubenswrapper[5008]: Mar 18 18:27:22 crc kubenswrapper[5008]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.538825 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-ae04-account-create-update-z85w5" podUID="5bc70fb8-0d00-4c19-a4d2-1721527c51e1" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.542750 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "defaf26d-efb3-4ab4-96fb-fe8826988fe1" (UID: "defaf26d-efb3-4ab4-96fb-fe8826988fe1"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590237 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590282 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-79e6-account-create-update-jqqxt" event={"ID":"1d951b25-e886-44c9-b7f7-d60853e1e0a9","Type":"ContainerStarted","Data":"8ae3752dcf688ed68d1f4db12af538ffb3d85c59068c483d0dcc7ed00b18b023"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590300 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-j5dtq"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590312 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590329 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590418 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2f63-account-create-update-z57k6"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590535 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"defaf26d-efb3-4ab4-96fb-fe8826988fe1","Type":"ContainerDied","Data":"5baba2548d7c511ef77d2873377d1b6c415a3d1316f182988c6113417f3cbbc0"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590573 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3502-account-create-update-qzqhp"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590589 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"defaf26d-efb3-4ab4-96fb-fe8826988fe1","Type":"ContainerDied","Data":"afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54"} Mar 18 18:27:22 crc 
kubenswrapper[5008]: I0318 18:27:22.590600 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"582dafe2-2020-4966-921d-cc5e9f0db46c","Type":"ContainerDied","Data":"4efce2fc93ac7a338b3af0f031e3448a62c3f819288460beeb882dd6abd7cbe9"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590612 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8lvzk"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590624 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590640 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8lvzk"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590651 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7bp5n"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590661 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8679cebf-8eea-45ae-be70-26eea9396f8e","Type":"ContainerDied","Data":"48672c4f4a417aeb7d70b46843dbaf3f5264f47434917232456154f0d644258b"} Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590674 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7bp5n"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590685 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590696 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-efa2-account-create-update-pwnw8"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.590849 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" 
podUID="ed55404d-2d05-4776-abed-7579ae87933d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://09257f442253d20a46172ce50016f2a69f725c7bb94a9a735847c735bae85e65" gracePeriod=30 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.591059 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="7ab5f625-144a-4c7c-bab8-5399de3b5a8e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c2dddb0fc09ab3bed8b4a15346419bed092d4dcea57bf35b28cdac3523e7802a" gracePeriod=30 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.591271 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="dd462bb4-44f5-4e0f-bc17-53d24604d474" containerName="nova-cell1-conductor-conductor" containerID="cri-o://efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a" gracePeriod=30 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.591342 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f26207e6-102f-4160-be7d-e1cad865fcc6" containerName="nova-scheduler-scheduler" containerID="cri-o://2481964e9586771d0845e5f21af329f1722fa2234ce3442a00e54b97033e13d1" gracePeriod=30 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.610529 5008 scope.go:117] "RemoveContainer" containerID="82263332669d05ad9b24a3bc4c19472247748eb643154b4c9d4bda10744e11ae" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.648135 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/defaf26d-efb3-4ab4-96fb-fe8826988fe1-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.685987 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ae04-account-create-update-z85w5"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 
18:27:22.690300 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3d5f0191-2702-46ed-ab82-e8c93ec1cf02" containerName="rabbitmq" containerID="cri-o://de7e46d1ea764d5f5ca940e23f45711a4a0780768418aba0d16fc5bc141f5795" gracePeriod=604800 Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.693257 5008 scope.go:117] "RemoveContainer" containerID="82263332669d05ad9b24a3bc4c19472247748eb643154b4c9d4bda10744e11ae" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.699952 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82263332669d05ad9b24a3bc4c19472247748eb643154b4c9d4bda10744e11ae\": container with ID starting with 82263332669d05ad9b24a3bc4c19472247748eb643154b4c9d4bda10744e11ae not found: ID does not exist" containerID="82263332669d05ad9b24a3bc4c19472247748eb643154b4c9d4bda10744e11ae" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.699987 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82263332669d05ad9b24a3bc4c19472247748eb643154b4c9d4bda10744e11ae"} err="failed to get container status \"82263332669d05ad9b24a3bc4c19472247748eb643154b4c9d4bda10744e11ae\": rpc error: code = NotFound desc = could not find container \"82263332669d05ad9b24a3bc4c19472247748eb643154b4c9d4bda10744e11ae\": container with ID starting with 82263332669d05ad9b24a3bc4c19472247748eb643154b4c9d4bda10744e11ae not found: ID does not exist" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.700011 5008 scope.go:117] "RemoveContainer" containerID="3a98cf2e27abc88c220c6aeb0751a297dd22a0499c8e9a35f107d7f5b969d959" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.706834 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.720366 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ovsdbserver-sb-0"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.731626 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-78xsw"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.748361 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-78xsw"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.754454 5008 scope.go:117] "RemoveContainer" containerID="3a98cf2e27abc88c220c6aeb0751a297dd22a0499c8e9a35f107d7f5b969d959" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.755910 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a98cf2e27abc88c220c6aeb0751a297dd22a0499c8e9a35f107d7f5b969d959\": container with ID starting with 3a98cf2e27abc88c220c6aeb0751a297dd22a0499c8e9a35f107d7f5b969d959 not found: ID does not exist" containerID="3a98cf2e27abc88c220c6aeb0751a297dd22a0499c8e9a35f107d7f5b969d959" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.755991 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a98cf2e27abc88c220c6aeb0751a297dd22a0499c8e9a35f107d7f5b969d959"} err="failed to get container status \"3a98cf2e27abc88c220c6aeb0751a297dd22a0499c8e9a35f107d7f5b969d959\": rpc error: code = NotFound desc = could not find container \"3a98cf2e27abc88c220c6aeb0751a297dd22a0499c8e9a35f107d7f5b969d959\": container with ID starting with 3a98cf2e27abc88c220c6aeb0751a297dd22a0499c8e9a35f107d7f5b969d959 not found: ID does not exist" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.756025 5008 scope.go:117] "RemoveContainer" containerID="5baba2548d7c511ef77d2873377d1b6c415a3d1316f182988c6113417f3cbbc0" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.756237 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.759681 5008 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.763700 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.765170 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.767692 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.767760 5008 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="dd462bb4-44f5-4e0f-bc17-53d24604d474" containerName="nova-cell1-conductor-conductor" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.780337 5008 scope.go:117] "RemoveContainer" containerID="afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.798470 5008 scope.go:117] "RemoveContainer" 
containerID="5baba2548d7c511ef77d2873377d1b6c415a3d1316f182988c6113417f3cbbc0" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.799275 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5baba2548d7c511ef77d2873377d1b6c415a3d1316f182988c6113417f3cbbc0\": container with ID starting with 5baba2548d7c511ef77d2873377d1b6c415a3d1316f182988c6113417f3cbbc0 not found: ID does not exist" containerID="5baba2548d7c511ef77d2873377d1b6c415a3d1316f182988c6113417f3cbbc0" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.799304 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5baba2548d7c511ef77d2873377d1b6c415a3d1316f182988c6113417f3cbbc0"} err="failed to get container status \"5baba2548d7c511ef77d2873377d1b6c415a3d1316f182988c6113417f3cbbc0\": rpc error: code = NotFound desc = could not find container \"5baba2548d7c511ef77d2873377d1b6c415a3d1316f182988c6113417f3cbbc0\": container with ID starting with 5baba2548d7c511ef77d2873377d1b6c415a3d1316f182988c6113417f3cbbc0 not found: ID does not exist" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.799326 5008 scope.go:117] "RemoveContainer" containerID="afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54" Mar 18 18:27:22 crc kubenswrapper[5008]: E0318 18:27:22.799858 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54\": container with ID starting with afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54 not found: ID does not exist" containerID="afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54" Mar 18 18:27:22 crc kubenswrapper[5008]: I0318 18:27:22.799883 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54"} err="failed to get container status \"afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54\": rpc error: code = NotFound desc = could not find container \"afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54\": container with ID starting with afc647e6f53441b2ed4a952053024ddee85112be72f07607702e2b26cfdd0f54 not found: ID does not exist" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.059219 5008 secret.go:188] Couldn't get secret openstack/nova-cell1-novncproxy-config-data: secret "nova-cell1-novncproxy-config-data" not found Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.059297 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-config-data podName:7ab5f625-144a-4c7c-bab8-5399de3b5a8e nodeName:}" failed. No retries permitted until 2026-03-18 18:27:25.059278544 +0000 UTC m=+1501.578751623 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-config-data") pod "nova-cell1-novncproxy-0" (UID: "7ab5f625-144a-4c7c-bab8-5399de3b5a8e") : secret "nova-cell1-novncproxy-config-data" not found Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.072508 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.160815 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e3448bfb-b04e-4f59-b275-45ae07178640-etc-swift\") pod \"e3448bfb-b04e-4f59-b275-45ae07178640\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.160877 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4b8l\" (UniqueName: \"kubernetes.io/projected/e3448bfb-b04e-4f59-b275-45ae07178640-kube-api-access-b4b8l\") pod \"e3448bfb-b04e-4f59-b275-45ae07178640\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.160939 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-config-data\") pod \"e3448bfb-b04e-4f59-b275-45ae07178640\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.160964 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-combined-ca-bundle\") pod \"e3448bfb-b04e-4f59-b275-45ae07178640\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.161034 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3448bfb-b04e-4f59-b275-45ae07178640-run-httpd\") pod \"e3448bfb-b04e-4f59-b275-45ae07178640\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.161097 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-internal-tls-certs\") pod \"e3448bfb-b04e-4f59-b275-45ae07178640\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.161119 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3448bfb-b04e-4f59-b275-45ae07178640-log-httpd\") pod \"e3448bfb-b04e-4f59-b275-45ae07178640\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.161151 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-public-tls-certs\") pod \"e3448bfb-b04e-4f59-b275-45ae07178640\" (UID: \"e3448bfb-b04e-4f59-b275-45ae07178640\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.162088 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3448bfb-b04e-4f59-b275-45ae07178640-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e3448bfb-b04e-4f59-b275-45ae07178640" (UID: "e3448bfb-b04e-4f59-b275-45ae07178640"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.162332 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3448bfb-b04e-4f59-b275-45ae07178640-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e3448bfb-b04e-4f59-b275-45ae07178640" (UID: "e3448bfb-b04e-4f59-b275-45ae07178640"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.169701 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3448bfb-b04e-4f59-b275-45ae07178640-kube-api-access-b4b8l" (OuterVolumeSpecName: "kube-api-access-b4b8l") pod "e3448bfb-b04e-4f59-b275-45ae07178640" (UID: "e3448bfb-b04e-4f59-b275-45ae07178640"). InnerVolumeSpecName "kube-api-access-b4b8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.169863 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3448bfb-b04e-4f59-b275-45ae07178640-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e3448bfb-b04e-4f59-b275-45ae07178640" (UID: "e3448bfb-b04e-4f59-b275-45ae07178640"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.213634 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-config-data" (OuterVolumeSpecName: "config-data") pod "e3448bfb-b04e-4f59-b275-45ae07178640" (UID: "e3448bfb-b04e-4f59-b275-45ae07178640"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.214570 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e3448bfb-b04e-4f59-b275-45ae07178640" (UID: "e3448bfb-b04e-4f59-b275-45ae07178640"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.217476 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e3448bfb-b04e-4f59-b275-45ae07178640" (UID: "e3448bfb-b04e-4f59-b275-45ae07178640"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.275422 5008 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e3448bfb-b04e-4f59-b275-45ae07178640-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.275954 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4b8l\" (UniqueName: \"kubernetes.io/projected/e3448bfb-b04e-4f59-b275-45ae07178640-kube-api-access-b4b8l\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.275969 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.275986 5008 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3448bfb-b04e-4f59-b275-45ae07178640-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.275999 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.276037 5008 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e3448bfb-b04e-4f59-b275-45ae07178640-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.276051 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.276149 5008 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.276228 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data podName:b60d757b-db66-46c1-ad92-4a9e591217a0 nodeName:}" failed. No retries permitted until 2026-03-18 18:27:27.276209517 +0000 UTC m=+1503.795682596 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data") pod "rabbitmq-cell1-server-0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0") : configmap "rabbitmq-cell1-config-data" not found Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.279295 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3448bfb-b04e-4f59-b275-45ae07178640" (UID: "e3448bfb-b04e-4f59-b275-45ae07178640"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.314072 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.314356 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="ceilometer-central-agent" containerID="cri-o://c59553cd5157024b46624fb3c5ef58bbf4ed0861562e0adaaec3f71273dc5a65" gracePeriod=30 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.314799 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="proxy-httpd" containerID="cri-o://9af388372b24fa973623ad52fe731bca5ce89b4e94b9e0a247770b6b45d5bade" gracePeriod=30 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.314853 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="sg-core" containerID="cri-o://3a5bb1f257b5ccc8acb612686fc959dffa3d2d0416f76fda835c27174ee06b01" gracePeriod=30 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.314883 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="ceilometer-notification-agent" containerID="cri-o://9c0e2fb1769c01fdd6e7cf853c2f7ba8ba4a7c7185217812a0c6f433539ebd75" gracePeriod=30 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.357563 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.357806 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84" 
containerName="kube-state-metrics" containerID="cri-o://a775bd2328e78c3e94321a78a960def3522b02e55084a83cd87ded4a5eb20603" gracePeriod=30 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.384136 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3448bfb-b04e-4f59-b275-45ae07178640-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.435297 5008 generic.go:334] "Generic (PLEG): container finished" podID="7ab5f625-144a-4c7c-bab8-5399de3b5a8e" containerID="c2dddb0fc09ab3bed8b4a15346419bed092d4dcea57bf35b28cdac3523e7802a" exitCode=0 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.435422 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7ab5f625-144a-4c7c-bab8-5399de3b5a8e","Type":"ContainerDied","Data":"c2dddb0fc09ab3bed8b4a15346419bed092d4dcea57bf35b28cdac3523e7802a"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.453683 5008 generic.go:334] "Generic (PLEG): container finished" podID="d67f3431-0e44-4d3c-8aa9-0f3fb176387d" containerID="f7c289c9bf02046bec3de029584d233f16cf2686d6ecf191d260ee9b9b82fc59" exitCode=143 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.453772 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-ff487fff5-mqmcg" event={"ID":"d67f3431-0e44-4d3c-8aa9-0f3fb176387d","Type":"ContainerDied","Data":"f7c289c9bf02046bec3de029584d233f16cf2686d6ecf191d260ee9b9b82fc59"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.488698 5008 generic.go:334] "Generic (PLEG): container finished" podID="be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" containerID="2f33719f92055074851b51599de618f808c5a4383b6fb86e503d5b01cf723941" exitCode=0 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.488805 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9","Type":"ContainerDied","Data":"2f33719f92055074851b51599de618f808c5a4383b6fb86e503d5b01cf723941"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.502136 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0a95-account-create-update-2rjzw"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.538511 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.538766 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="6cd78c73-6590-4035-af7d-357b8451f0ad" containerName="memcached" containerID="cri-o://a10f7834955df6f8ab45ed451b62d37ed1daf41ca59473a2c03038d777c8d5dd" gracePeriod=30 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.542213 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ae04-account-create-update-z85w5" event={"ID":"5bc70fb8-0d00-4c19-a4d2-1721527c51e1","Type":"ContainerStarted","Data":"8f6b3910eb050f98f27c41a1a4a017804bda51c07d10e8963b21659b8dfbc552"} Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.575824 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4cc6ebdf06122d5824ebe49392c76f20a08604dc1120f4380c5f58b19d7e4d01 is running failed: container process not found" containerID="4cc6ebdf06122d5824ebe49392c76f20a08604dc1120f4380c5f58b19d7e4d01" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.618669 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0a95-account-create-update-2rjzw"] Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.624610 5008 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 
18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.624690 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts podName:8724ccad-851e-4efc-ad3c-d34252a3f29f nodeName:}" failed. No retries permitted until 2026-03-18 18:27:25.624673207 +0000 UTC m=+1502.144146286 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts") pod "root-account-create-update-j5dtq" (UID: "8724ccad-851e-4efc-ad3c-d34252a3f29f") : configmap "openstack-cell1-scripts" not found Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.644987 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4cc6ebdf06122d5824ebe49392c76f20a08604dc1120f4380c5f58b19d7e4d01 is running failed: container process not found" containerID="4cc6ebdf06122d5824ebe49392c76f20a08604dc1120f4380c5f58b19d7e4d01" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.645213 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2f63-account-create-update-z57k6" event={"ID":"a5b1de51-7913-41fc-afd9-b1f901532d03","Type":"ContainerStarted","Data":"73944b9228d1cb29c8a63645c569238f967f3be885d1db23993a400cf00b7f3a"} Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.651419 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4cc6ebdf06122d5824ebe49392c76f20a08604dc1120f4380c5f58b19d7e4d01 is running failed: container process not found" containerID="4cc6ebdf06122d5824ebe49392c76f20a08604dc1120f4380c5f58b19d7e4d01" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 18 18:27:23 crc kubenswrapper[5008]: 
E0318 18:27:23.651499 5008 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4cc6ebdf06122d5824ebe49392c76f20a08604dc1120f4380c5f58b19d7e4d01 is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="07bd6644-ca18-4b8d-ad83-9757257768fb" containerName="galera" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.668778 5008 generic.go:334] "Generic (PLEG): container finished" podID="bda3600a-d612-43ec-8b45-77eccc420b0f" containerID="b41987f381e2bd0e36d9ddaace1b57cfba485aef900b80078b6110928f0b7ca2" exitCode=143 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.669415 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bda3600a-d612-43ec-8b45-77eccc420b0f","Type":"ContainerDied","Data":"b41987f381e2bd0e36d9ddaace1b57cfba485aef900b80078b6110928f0b7ca2"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.696400 5008 generic.go:334] "Generic (PLEG): container finished" podID="96efea0e-17ae-49c4-8f5c-b7341def6878" containerID="ee06201918dbd5795e039c28701fa7c8e3bc7e2199124f057fd49d0f0f12daa4" exitCode=143 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.696535 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"96efea0e-17ae-49c4-8f5c-b7341def6878","Type":"ContainerDied","Data":"ee06201918dbd5795e039c28701fa7c8e3bc7e2199124f057fd49d0f0f12daa4"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.712653 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0a95-account-create-update-wpsgf"] Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.713049 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3448bfb-b04e-4f59-b275-45ae07178640" containerName="proxy-httpd" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713061 5008 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e3448bfb-b04e-4f59-b275-45ae07178640" containerName="proxy-httpd" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.713073 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edb3df2-7960-412a-ba0f-32bd8fdabc86" containerName="ovsdbserver-sb" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713079 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edb3df2-7960-412a-ba0f-32bd8fdabc86" containerName="ovsdbserver-sb" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.713089 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8857503-cb26-46f0-b4a3-e931a9e3f1ed" containerName="openstack-network-exporter" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713094 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8857503-cb26-46f0-b4a3-e931a9e3f1ed" containerName="openstack-network-exporter" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.713109 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b393c9-78fb-4bde-930d-6af4b840f9e3" containerName="init" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713115 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b393c9-78fb-4bde-930d-6af4b840f9e3" containerName="init" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.713128 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defaf26d-efb3-4ab4-96fb-fe8826988fe1" containerName="ovsdbserver-nb" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713134 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="defaf26d-efb3-4ab4-96fb-fe8826988fe1" containerName="ovsdbserver-nb" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.713143 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defaf26d-efb3-4ab4-96fb-fe8826988fe1" containerName="openstack-network-exporter" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713149 5008 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="defaf26d-efb3-4ab4-96fb-fe8826988fe1" containerName="openstack-network-exporter" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.713159 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3448bfb-b04e-4f59-b275-45ae07178640" containerName="proxy-server" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713164 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3448bfb-b04e-4f59-b275-45ae07178640" containerName="proxy-server" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.713176 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b393c9-78fb-4bde-930d-6af4b840f9e3" containerName="dnsmasq-dns" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713181 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b393c9-78fb-4bde-930d-6af4b840f9e3" containerName="dnsmasq-dns" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.713199 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edb3df2-7960-412a-ba0f-32bd8fdabc86" containerName="openstack-network-exporter" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713204 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edb3df2-7960-412a-ba0f-32bd8fdabc86" containerName="openstack-network-exporter" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713371 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4edb3df2-7960-412a-ba0f-32bd8fdabc86" containerName="ovsdbserver-sb" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713384 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="defaf26d-efb3-4ab4-96fb-fe8826988fe1" containerName="ovsdbserver-nb" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713392 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8857503-cb26-46f0-b4a3-e931a9e3f1ed" containerName="openstack-network-exporter" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713403 5008 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e3448bfb-b04e-4f59-b275-45ae07178640" containerName="proxy-server" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713418 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b393c9-78fb-4bde-930d-6af4b840f9e3" containerName="dnsmasq-dns" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713424 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="defaf26d-efb3-4ab4-96fb-fe8826988fe1" containerName="openstack-network-exporter" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713432 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4edb3df2-7960-412a-ba0f-32bd8fdabc86" containerName="openstack-network-exporter" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.713444 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3448bfb-b04e-4f59-b275-45ae07178640" containerName="proxy-httpd" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.716406 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0a95-account-create-update-wpsgf" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.718955 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.719953 5008 generic.go:334] "Generic (PLEG): container finished" podID="07bd6644-ca18-4b8d-ad83-9757257768fb" containerID="4cc6ebdf06122d5824ebe49392c76f20a08604dc1120f4380c5f58b19d7e4d01" exitCode=0 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.720007 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"07bd6644-ca18-4b8d-ad83-9757257768fb","Type":"ContainerDied","Data":"4cc6ebdf06122d5824ebe49392c76f20a08604dc1120f4380c5f58b19d7e4d01"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.726470 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" event={"ID":"68b393c9-78fb-4bde-930d-6af4b840f9e3","Type":"ContainerDied","Data":"9122f6f26080b81ea5afa92af72ef472499bf65dadd0621b6646e545ce98fe62"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.726512 5008 scope.go:117] "RemoveContainer" containerID="22db4948be656e49590d67a4b03650ce23c340a3e654f974e8ccf95e3e514659" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.726695 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.726770 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.729568 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0a95-account-create-update-wpsgf"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.751379 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vs2sl"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.751405 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3502-account-create-update-qzqhp" event={"ID":"8026d1a2-1e3a-4930-9424-56565551f4bb","Type":"ContainerStarted","Data":"19c05e4dd42f97abf2758c77aae2c62c0164eb23cb3f97c6e0762219ed0ca874"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.768127 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vs2sl"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.783702 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rsqds"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.791043 5008 scope.go:117] "RemoveContainer" containerID="aacd5171f8a4d4ac3c147fedb821e1e7fd55e12771e32ff44716b52bc9ceb37d" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.793097 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.794489 5008 generic.go:334] "Generic (PLEG): container finished" podID="24a03e07-237e-4583-81b4-8d9aadc76ea3" containerID="b33ab61dcd597370160ead5da0be76a79fadc5911c77ebcd182c96bc9147d686" exitCode=143 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.794721 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" event={"ID":"24a03e07-237e-4583-81b4-8d9aadc76ea3","Type":"ContainerDied","Data":"b33ab61dcd597370160ead5da0be76a79fadc5911c77ebcd182c96bc9147d686"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.795376 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7d8b8459c4-2mq5n"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.795885 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-7d8b8459c4-2mq5n" podUID="16314cf6-663f-4fa9-a1e7-272c1a183b58" containerName="keystone-api" containerID="cri-o://5398febbc7a97bbef77797f71cbc94ea09c16ee97775bfa2cf52a7892be384b7" gracePeriod=30 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.814162 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rsqds"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.830775 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-combined-ca-bundle\") pod \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.830867 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfx95\" (UniqueName: \"kubernetes.io/projected/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-kube-api-access-hfx95\") pod 
\"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.830950 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-config-data\") pod \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.831049 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-vencrypt-tls-certs\") pod \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.831089 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-nova-novncproxy-tls-certs\") pod \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\" (UID: \"7ab5f625-144a-4c7c-bab8-5399de3b5a8e\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.831349 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db886769-350c-4f91-a8b7-77b357bc7cda-operator-scripts\") pod \"keystone-0a95-account-create-update-wpsgf\" (UID: \"db886769-350c-4f91-a8b7-77b357bc7cda\") " pod="openstack/keystone-0a95-account-create-update-wpsgf" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.831379 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kml5c\" (UniqueName: \"kubernetes.io/projected/db886769-350c-4f91-a8b7-77b357bc7cda-kube-api-access-kml5c\") pod \"keystone-0a95-account-create-update-wpsgf\" (UID: \"db886769-350c-4f91-a8b7-77b357bc7cda\") " 
pod="openstack/keystone-0a95-account-create-update-wpsgf" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.835339 5008 generic.go:334] "Generic (PLEG): container finished" podID="e3448bfb-b04e-4f59-b275-45ae07178640" containerID="e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781" exitCode=0 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.835369 5008 generic.go:334] "Generic (PLEG): container finished" podID="e3448bfb-b04e-4f59-b275-45ae07178640" containerID="868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307" exitCode=0 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.835490 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.835496 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.835572 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" event={"ID":"e3448bfb-b04e-4f59-b275-45ae07178640","Type":"ContainerDied","Data":"e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.835610 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" event={"ID":"e3448bfb-b04e-4f59-b275-45ae07178640","Type":"ContainerDied","Data":"868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.835620 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7cc79d78dc-6z4kh" event={"ID":"e3448bfb-b04e-4f59-b275-45ae07178640","Type":"ContainerDied","Data":"1b4706de9697d4f4dc77da13cd77bdefb9eaad94681e6e2726fbf0b890e52c0a"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.847295 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-kube-api-access-hfx95" (OuterVolumeSpecName: "kube-api-access-hfx95") pod "7ab5f625-144a-4c7c-bab8-5399de3b5a8e" (UID: "7ab5f625-144a-4c7c-bab8-5399de3b5a8e"). InnerVolumeSpecName "kube-api-access-hfx95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.853602 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-q2tvv"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.854690 5008 generic.go:334] "Generic (PLEG): container finished" podID="8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" containerID="e741fbe7654b140e43fc28460cedf85595febc32bb65a05c9b3ffb75588298c2" exitCode=143 Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.854767 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77598b888d-8wwqt" event={"ID":"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad","Type":"ContainerDied","Data":"e741fbe7654b140e43fc28460cedf85595febc32bb65a05c9b3ffb75588298c2"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.857322 5008 scope.go:117] "RemoveContainer" containerID="e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.858717 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-efa2-account-create-update-pwnw8" event={"ID":"5de40aba-2f95-4c75-8875-8ffbf5f17898","Type":"ContainerStarted","Data":"152fc62dd5b8ded6f87ce1b1f610375d2f8201b002f451f320956c0f62361ca1"} Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.869376 5008 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-j5dtq" secret="" err="secret \"galera-openstack-cell1-dockercfg-dkzfs\" not found" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.881974 5008 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 18:27:23 crc kubenswrapper[5008]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 18:27:23 crc kubenswrapper[5008]: Mar 18 18:27:23 crc kubenswrapper[5008]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 18:27:23 crc kubenswrapper[5008]: Mar 18 18:27:23 crc kubenswrapper[5008]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 18:27:23 crc kubenswrapper[5008]: Mar 18 18:27:23 crc kubenswrapper[5008]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 18:27:23 crc kubenswrapper[5008]: Mar 18 18:27:23 crc kubenswrapper[5008]: if [ -n "" ]; then Mar 18 18:27:23 crc kubenswrapper[5008]: GRANT_DATABASE="" Mar 18 18:27:23 crc kubenswrapper[5008]: else Mar 18 18:27:23 crc kubenswrapper[5008]: GRANT_DATABASE="*" Mar 18 18:27:23 crc kubenswrapper[5008]: fi Mar 18 18:27:23 crc kubenswrapper[5008]: Mar 18 18:27:23 crc kubenswrapper[5008]: # going for maximum compatibility here: Mar 18 18:27:23 crc kubenswrapper[5008]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 18:27:23 crc kubenswrapper[5008]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 18:27:23 crc kubenswrapper[5008]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 18:27:23 crc kubenswrapper[5008]: # support updates Mar 18 18:27:23 crc kubenswrapper[5008]: Mar 18 18:27:23 crc kubenswrapper[5008]: $MYSQL_CMD < logger="UnhandledError" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.887249 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-config-data" (OuterVolumeSpecName: "config-data") pod "7ab5f625-144a-4c7c-bab8-5399de3b5a8e" (UID: "7ab5f625-144a-4c7c-bab8-5399de3b5a8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.888383 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-j5dtq" podUID="8724ccad-851e-4efc-ad3c-d34252a3f29f" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.897085 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-q2tvv"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.905693 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0a95-account-create-update-wpsgf"] Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.906449 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-kml5c operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-0a95-account-create-update-wpsgf" podUID="db886769-350c-4f91-a8b7-77b357bc7cda" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.907113 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "7ab5f625-144a-4c7c-bab8-5399de3b5a8e" (UID: "7ab5f625-144a-4c7c-bab8-5399de3b5a8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.916019 5008 scope.go:117] "RemoveContainer" containerID="868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.917038 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "7ab5f625-144a-4c7c-bab8-5399de3b5a8e" (UID: "7ab5f625-144a-4c7c-bab8-5399de3b5a8e"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.933171 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/07bd6644-ca18-4b8d-ad83-9757257768fb-galera-tls-certs\") pod \"07bd6644-ca18-4b8d-ad83-9757257768fb\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.933219 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"07bd6644-ca18-4b8d-ad83-9757257768fb\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.933243 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-config-data-default\") pod \"07bd6644-ca18-4b8d-ad83-9757257768fb\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.933278 5008 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb45v\" (UniqueName: \"kubernetes.io/projected/07bd6644-ca18-4b8d-ad83-9757257768fb-kube-api-access-zb45v\") pod \"07bd6644-ca18-4b8d-ad83-9757257768fb\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.933335 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07bd6644-ca18-4b8d-ad83-9757257768fb-combined-ca-bundle\") pod \"07bd6644-ca18-4b8d-ad83-9757257768fb\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.933351 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-operator-scripts\") pod \"07bd6644-ca18-4b8d-ad83-9757257768fb\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.933403 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/07bd6644-ca18-4b8d-ad83-9757257768fb-config-data-generated\") pod \"07bd6644-ca18-4b8d-ad83-9757257768fb\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.933420 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-kolla-config\") pod \"07bd6644-ca18-4b8d-ad83-9757257768fb\" (UID: \"07bd6644-ca18-4b8d-ad83-9757257768fb\") " Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.933617 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db886769-350c-4f91-a8b7-77b357bc7cda-operator-scripts\") pod 
\"keystone-0a95-account-create-update-wpsgf\" (UID: \"db886769-350c-4f91-a8b7-77b357bc7cda\") " pod="openstack/keystone-0a95-account-create-update-wpsgf" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.933646 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kml5c\" (UniqueName: \"kubernetes.io/projected/db886769-350c-4f91-a8b7-77b357bc7cda-kube-api-access-kml5c\") pod \"keystone-0a95-account-create-update-wpsgf\" (UID: \"db886769-350c-4f91-a8b7-77b357bc7cda\") " pod="openstack/keystone-0a95-account-create-update-wpsgf" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.934144 5008 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.934274 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.934335 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfx95\" (UniqueName: \"kubernetes.io/projected/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-kube-api-access-hfx95\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.934394 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.937717 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "07bd6644-ca18-4b8d-ad83-9757257768fb" 
(UID: "07bd6644-ca18-4b8d-ad83-9757257768fb"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.938932 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "07bd6644-ca18-4b8d-ad83-9757257768fb" (UID: "07bd6644-ca18-4b8d-ad83-9757257768fb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.939392 5008 projected.go:194] Error preparing data for projected volume kube-api-access-kml5c for pod openstack/keystone-0a95-account-create-update-wpsgf: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.939445 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db886769-350c-4f91-a8b7-77b357bc7cda-kube-api-access-kml5c podName:db886769-350c-4f91-a8b7-77b357bc7cda nodeName:}" failed. No retries permitted until 2026-03-18 18:27:24.439429913 +0000 UTC m=+1500.958902992 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kml5c" (UniqueName: "kubernetes.io/projected/db886769-350c-4f91-a8b7-77b357bc7cda-kube-api-access-kml5c") pod "keystone-0a95-account-create-update-wpsgf" (UID: "db886769-350c-4f91-a8b7-77b357bc7cda") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.939853 5008 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.939897 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db886769-350c-4f91-a8b7-77b357bc7cda-operator-scripts podName:db886769-350c-4f91-a8b7-77b357bc7cda nodeName:}" failed. No retries permitted until 2026-03-18 18:27:24.439888385 +0000 UTC m=+1500.959361464 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/db886769-350c-4f91-a8b7-77b357bc7cda-operator-scripts") pod "keystone-0a95-account-create-update-wpsgf" (UID: "db886769-350c-4f91-a8b7-77b357bc7cda") : configmap "openstack-scripts" not found Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.941000 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07bd6644-ca18-4b8d-ad83-9757257768fb-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "07bd6644-ca18-4b8d-ad83-9757257768fb" (UID: "07bd6644-ca18-4b8d-ad83-9757257768fb"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.942269 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "07bd6644-ca18-4b8d-ad83-9757257768fb" (UID: "07bd6644-ca18-4b8d-ad83-9757257768fb"). 
InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.964532 5008 scope.go:117] "RemoveContainer" containerID="e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.966423 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781\": container with ID starting with e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781 not found: ID does not exist" containerID="e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.966456 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781"} err="failed to get container status \"e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781\": rpc error: code = NotFound desc = could not find container \"e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781\": container with ID starting with e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781 not found: ID does not exist" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.966482 5008 scope.go:117] "RemoveContainer" containerID="868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307" Mar 18 18:27:23 crc kubenswrapper[5008]: E0318 18:27:23.966998 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307\": container with ID starting with 868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307 not found: ID does not exist" containerID="868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307" Mar 18 18:27:23 crc 
kubenswrapper[5008]: I0318 18:27:23.967021 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307"} err="failed to get container status \"868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307\": rpc error: code = NotFound desc = could not find container \"868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307\": container with ID starting with 868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307 not found: ID does not exist" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.967033 5008 scope.go:117] "RemoveContainer" containerID="e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.967270 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781"} err="failed to get container status \"e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781\": rpc error: code = NotFound desc = could not find container \"e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781\": container with ID starting with e1508498c903ce3110160d55f0f26f2f8e18cab5da93baa816089e4abb38d781 not found: ID does not exist" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.967290 5008 scope.go:117] "RemoveContainer" containerID="868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.967610 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307"} err="failed to get container status \"868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307\": rpc error: code = NotFound desc = could not find container \"868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307\": container 
with ID starting with 868073a75a1893f7c2dc226a15d6f9b227e119cff5add27af38db82aa8347307 not found: ID does not exist" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.970906 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07bd6644-ca18-4b8d-ad83-9757257768fb-kube-api-access-zb45v" (OuterVolumeSpecName: "kube-api-access-zb45v") pod "07bd6644-ca18-4b8d-ad83-9757257768fb" (UID: "07bd6644-ca18-4b8d-ad83-9757257768fb"). InnerVolumeSpecName "kube-api-access-zb45v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.970978 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bd85b459c-gfwz9"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.974746 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bd85b459c-gfwz9"] Mar 18 18:27:23 crc kubenswrapper[5008]: I0318 18:27:23.981429 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "07bd6644-ca18-4b8d-ad83-9757257768fb" (UID: "07bd6644-ca18-4b8d-ad83-9757257768fb"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.037397 5008 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/07bd6644-ca18-4b8d-ad83-9757257768fb-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.037428 5008 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.037452 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.037462 5008 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.037470 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb45v\" (UniqueName: \"kubernetes.io/projected/07bd6644-ca18-4b8d-ad83-9757257768fb-kube-api-access-zb45v\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.037479 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07bd6644-ca18-4b8d-ad83-9757257768fb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.023845 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7cc79d78dc-6z4kh"] Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.044286 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/swift-proxy-7cc79d78dc-6z4kh"] Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.044641 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07bd6644-ca18-4b8d-ad83-9757257768fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07bd6644-ca18-4b8d-ad83-9757257768fb" (UID: "07bd6644-ca18-4b8d-ad83-9757257768fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.047712 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "7ab5f625-144a-4c7c-bab8-5399de3b5a8e" (UID: "7ab5f625-144a-4c7c-bab8-5399de3b5a8e"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.058747 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.065598 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07bd6644-ca18-4b8d-ad83-9757257768fb-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "07bd6644-ca18-4b8d-ad83-9757257768fb" (UID: "07bd6644-ca18-4b8d-ad83-9757257768fb"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.138692 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07bd6644-ca18-4b8d-ad83-9757257768fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.138725 5008 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ab5f625-144a-4c7c-bab8-5399de3b5a8e-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.138735 5008 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/07bd6644-ca18-4b8d-ad83-9757257768fb-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.138765 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.143287 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.200434 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="8724770c-4223-4cfe-b35b-be7cd1a6a9ff" containerName="galera" containerID="cri-o://2735f632f3585bddce99d7d606ae426bc322f9f8793ae4e5d5d4ce755bf8652e" gracePeriod=30 Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.218598 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17cfd4aa-d76e-4a7a-a4b1-c772f531ac03" path="/var/lib/kubelet/pods/17cfd4aa-d76e-4a7a-a4b1-c772f531ac03/volumes" Mar 18 18:27:24 crc kubenswrapper[5008]: E0318 18:27:24.228901 5008 info.go:109] Failed to get network devices: open /sys/class/net/19c05e4dd42f97a/address: no such file or directory Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.230286 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47b56aea-4152-4110-8c73-02754afa2807" path="/var/lib/kubelet/pods/47b56aea-4152-4110-8c73-02754afa2807/volumes" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.231075 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4edb3df2-7960-412a-ba0f-32bd8fdabc86" path="/var/lib/kubelet/pods/4edb3df2-7960-412a-ba0f-32bd8fdabc86/volumes" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.231669 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f847fb-ea4e-4c60-82ab-8401eb7bf256" path="/var/lib/kubelet/pods/51f847fb-ea4e-4c60-82ab-8401eb7bf256/volumes" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.234047 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="671ab4c3-3b06-428a-83e4-104ca0251bf1" path="/var/lib/kubelet/pods/671ab4c3-3b06-428a-83e4-104ca0251bf1/volumes" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.242296 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-state-metrics-tls-config\") pod \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.242462 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-state-metrics-tls-certs\") pod \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.242509 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-combined-ca-bundle\") pod \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.242708 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz9n2\" (UniqueName: \"kubernetes.io/projected/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-api-access-jz9n2\") pod \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\" (UID: \"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.250486 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68b393c9-78fb-4bde-930d-6af4b840f9e3" path="/var/lib/kubelet/pods/68b393c9-78fb-4bde-930d-6af4b840f9e3/volumes" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.251365 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c1a8234-533f-4cd5-9517-52acab86e99f" path="/var/lib/kubelet/pods/9c1a8234-533f-4cd5-9517-52acab86e99f/volumes" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.252043 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9ca61fbb-1714-4b6c-aca8-547813e7d581" path="/var/lib/kubelet/pods/9ca61fbb-1714-4b6c-aca8-547813e7d581/volumes" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.253176 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8857503-cb26-46f0-b4a3-e931a9e3f1ed" path="/var/lib/kubelet/pods/a8857503-cb26-46f0-b4a3-e931a9e3f1ed/volumes" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.254155 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="defaf26d-efb3-4ab4-96fb-fe8826988fe1" path="/var/lib/kubelet/pods/defaf26d-efb3-4ab4-96fb-fe8826988fe1/volumes" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.254895 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3448bfb-b04e-4f59-b275-45ae07178640" path="/var/lib/kubelet/pods/e3448bfb-b04e-4f59-b275-45ae07178640/volumes" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.303081 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-api-access-jz9n2" (OuterVolumeSpecName: "kube-api-access-jz9n2") pod "5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84" (UID: "5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84"). InnerVolumeSpecName "kube-api-access-jz9n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.375415 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz9n2\" (UniqueName: \"kubernetes.io/projected/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-api-access-jz9n2\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.379806 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84" (UID: "5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.382423 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84" (UID: "5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.393468 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84" (UID: "5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.461970 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.462043 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.477257 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/db886769-350c-4f91-a8b7-77b357bc7cda-operator-scripts\") pod \"keystone-0a95-account-create-update-wpsgf\" (UID: \"db886769-350c-4f91-a8b7-77b357bc7cda\") " pod="openstack/keystone-0a95-account-create-update-wpsgf" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.477300 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kml5c\" (UniqueName: \"kubernetes.io/projected/db886769-350c-4f91-a8b7-77b357bc7cda-kube-api-access-kml5c\") pod \"keystone-0a95-account-create-update-wpsgf\" (UID: \"db886769-350c-4f91-a8b7-77b357bc7cda\") " pod="openstack/keystone-0a95-account-create-update-wpsgf" Mar 18 18:27:24 crc kubenswrapper[5008]: E0318 18:27:24.477412 5008 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.477460 5008 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: E0318 18:27:24.477486 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db886769-350c-4f91-a8b7-77b357bc7cda-operator-scripts podName:db886769-350c-4f91-a8b7-77b357bc7cda nodeName:}" failed. No retries permitted until 2026-03-18 18:27:25.477466289 +0000 UTC m=+1501.996939358 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/db886769-350c-4f91-a8b7-77b357bc7cda-operator-scripts") pod "keystone-0a95-account-create-update-wpsgf" (UID: "db886769-350c-4f91-a8b7-77b357bc7cda") : configmap "openstack-scripts" not found Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.477516 5008 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: E0318 18:27:24.477523 5008 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.477531 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: E0318 18:27:24.477581 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data podName:3d5f0191-2702-46ed-ab82-e8c93ec1cf02 nodeName:}" failed. No retries permitted until 2026-03-18 18:27:28.477566892 +0000 UTC m=+1504.997039971 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data") pod "rabbitmq-server-0" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02") : configmap "rabbitmq-config-data" not found Mar 18 18:27:24 crc kubenswrapper[5008]: E0318 18:27:24.480443 5008 projected.go:194] Error preparing data for projected volume kube-api-access-kml5c for pod openstack/keystone-0a95-account-create-update-wpsgf: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 18:27:24 crc kubenswrapper[5008]: E0318 18:27:24.480483 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db886769-350c-4f91-a8b7-77b357bc7cda-kube-api-access-kml5c podName:db886769-350c-4f91-a8b7-77b357bc7cda nodeName:}" failed. No retries permitted until 2026-03-18 18:27:25.480474159 +0000 UTC m=+1501.999947238 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-kml5c" (UniqueName: "kubernetes.io/projected/db886769-350c-4f91-a8b7-77b357bc7cda-kube-api-access-kml5c") pod "keystone-0a95-account-create-update-wpsgf" (UID: "db886769-350c-4f91-a8b7-77b357bc7cda") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.547865 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ae04-account-create-update-z85w5" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.560302 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2f63-account-create-update-z57k6" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.570011 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3502-account-create-update-qzqhp" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.581880 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8026d1a2-1e3a-4930-9424-56565551f4bb-operator-scripts\") pod \"8026d1a2-1e3a-4930-9424-56565551f4bb\" (UID: \"8026d1a2-1e3a-4930-9424-56565551f4bb\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.581943 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8wpf\" (UniqueName: \"kubernetes.io/projected/5bc70fb8-0d00-4c19-a4d2-1721527c51e1-kube-api-access-v8wpf\") pod \"5bc70fb8-0d00-4c19-a4d2-1721527c51e1\" (UID: \"5bc70fb8-0d00-4c19-a4d2-1721527c51e1\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.581992 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbgwt\" (UniqueName: \"kubernetes.io/projected/a5b1de51-7913-41fc-afd9-b1f901532d03-kube-api-access-rbgwt\") pod \"a5b1de51-7913-41fc-afd9-b1f901532d03\" (UID: \"a5b1de51-7913-41fc-afd9-b1f901532d03\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.582099 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bc70fb8-0d00-4c19-a4d2-1721527c51e1-operator-scripts\") pod \"5bc70fb8-0d00-4c19-a4d2-1721527c51e1\" (UID: \"5bc70fb8-0d00-4c19-a4d2-1721527c51e1\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.582121 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxmv7\" (UniqueName: \"kubernetes.io/projected/8026d1a2-1e3a-4930-9424-56565551f4bb-kube-api-access-nxmv7\") pod \"8026d1a2-1e3a-4930-9424-56565551f4bb\" (UID: \"8026d1a2-1e3a-4930-9424-56565551f4bb\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.582165 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b1de51-7913-41fc-afd9-b1f901532d03-operator-scripts\") pod \"a5b1de51-7913-41fc-afd9-b1f901532d03\" (UID: \"a5b1de51-7913-41fc-afd9-b1f901532d03\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.583785 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5b1de51-7913-41fc-afd9-b1f901532d03-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5b1de51-7913-41fc-afd9-b1f901532d03" (UID: "a5b1de51-7913-41fc-afd9-b1f901532d03"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.584101 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc70fb8-0d00-4c19-a4d2-1721527c51e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5bc70fb8-0d00-4c19-a4d2-1721527c51e1" (UID: "5bc70fb8-0d00-4c19-a4d2-1721527c51e1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.585943 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8026d1a2-1e3a-4930-9424-56565551f4bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8026d1a2-1e3a-4930-9424-56565551f4bb" (UID: "8026d1a2-1e3a-4930-9424-56565551f4bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.586119 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b1de51-7913-41fc-afd9-b1f901532d03-kube-api-access-rbgwt" (OuterVolumeSpecName: "kube-api-access-rbgwt") pod "a5b1de51-7913-41fc-afd9-b1f901532d03" (UID: "a5b1de51-7913-41fc-afd9-b1f901532d03"). 
InnerVolumeSpecName "kube-api-access-rbgwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.588476 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bc70fb8-0d00-4c19-a4d2-1721527c51e1-kube-api-access-v8wpf" (OuterVolumeSpecName: "kube-api-access-v8wpf") pod "5bc70fb8-0d00-4c19-a4d2-1721527c51e1" (UID: "5bc70fb8-0d00-4c19-a4d2-1721527c51e1"). InnerVolumeSpecName "kube-api-access-v8wpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.609688 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8026d1a2-1e3a-4930-9424-56565551f4bb-kube-api-access-nxmv7" (OuterVolumeSpecName: "kube-api-access-nxmv7") pod "8026d1a2-1e3a-4930-9424-56565551f4bb" (UID: "8026d1a2-1e3a-4930-9424-56565551f4bb"). InnerVolumeSpecName "kube-api-access-nxmv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.683608 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b1de51-7913-41fc-afd9-b1f901532d03-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.685533 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8026d1a2-1e3a-4930-9424-56565551f4bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.685680 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8wpf\" (UniqueName: \"kubernetes.io/projected/5bc70fb8-0d00-4c19-a4d2-1721527c51e1-kube-api-access-v8wpf\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.685766 5008 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rbgwt\" (UniqueName: \"kubernetes.io/projected/a5b1de51-7913-41fc-afd9-b1f901532d03-kube-api-access-rbgwt\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.686858 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bc70fb8-0d00-4c19-a4d2-1721527c51e1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.686891 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxmv7\" (UniqueName: \"kubernetes.io/projected/8026d1a2-1e3a-4930-9424-56565551f4bb-kube-api-access-nxmv7\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.697089 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9abb-account-create-update-z2jlj" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.712843 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b86568468-vhc29" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.723487 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-efa2-account-create-update-pwnw8" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.735282 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789232 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-public-tls-certs\") pod \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789272 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-config-data\") pod \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789320 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd78c73-6590-4035-af7d-357b8451f0ad-combined-ca-bundle\") pod \"6cd78c73-6590-4035-af7d-357b8451f0ad\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789367 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrpss\" (UniqueName: \"kubernetes.io/projected/efb8b8df-bedb-4e35-b709-25e83be00470-kube-api-access-vrpss\") pod \"efb8b8df-bedb-4e35-b709-25e83be00470\" (UID: \"efb8b8df-bedb-4e35-b709-25e83be00470\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789406 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rb8q\" (UniqueName: \"kubernetes.io/projected/5de40aba-2f95-4c75-8875-8ffbf5f17898-kube-api-access-6rb8q\") pod \"5de40aba-2f95-4c75-8875-8ffbf5f17898\" (UID: \"5de40aba-2f95-4c75-8875-8ffbf5f17898\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789455 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-combined-ca-bundle\") pod \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789481 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de40aba-2f95-4c75-8875-8ffbf5f17898-operator-scripts\") pod \"5de40aba-2f95-4c75-8875-8ffbf5f17898\" (UID: \"5de40aba-2f95-4c75-8875-8ffbf5f17898\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789501 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd78c73-6590-4035-af7d-357b8451f0ad-memcached-tls-certs\") pod \"6cd78c73-6590-4035-af7d-357b8451f0ad\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789585 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-internal-tls-certs\") pod \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789624 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-scripts\") pod \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789643 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cd78c73-6590-4035-af7d-357b8451f0ad-config-data\") pod \"6cd78c73-6590-4035-af7d-357b8451f0ad\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " Mar 18 18:27:24 crc 
kubenswrapper[5008]: I0318 18:27:24.789664 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd649\" (UniqueName: \"kubernetes.io/projected/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-kube-api-access-sd649\") pod \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789719 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb8b8df-bedb-4e35-b709-25e83be00470-operator-scripts\") pod \"efb8b8df-bedb-4e35-b709-25e83be00470\" (UID: \"efb8b8df-bedb-4e35-b709-25e83be00470\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789743 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6cd78c73-6590-4035-af7d-357b8451f0ad-kolla-config\") pod \"6cd78c73-6590-4035-af7d-357b8451f0ad\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789760 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-logs\") pod \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\" (UID: \"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.789794 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjhtg\" (UniqueName: \"kubernetes.io/projected/6cd78c73-6590-4035-af7d-357b8451f0ad-kube-api-access-kjhtg\") pod \"6cd78c73-6590-4035-af7d-357b8451f0ad\" (UID: \"6cd78c73-6590-4035-af7d-357b8451f0ad\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.793145 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd78c73-6590-4035-af7d-357b8451f0ad-config-data" (OuterVolumeSpecName: 
"config-data") pod "6cd78c73-6590-4035-af7d-357b8451f0ad" (UID: "6cd78c73-6590-4035-af7d-357b8451f0ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.793744 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd78c73-6590-4035-af7d-357b8451f0ad-kube-api-access-kjhtg" (OuterVolumeSpecName: "kube-api-access-kjhtg") pod "6cd78c73-6590-4035-af7d-357b8451f0ad" (UID: "6cd78c73-6590-4035-af7d-357b8451f0ad"). InnerVolumeSpecName "kube-api-access-kjhtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.794097 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb8b8df-bedb-4e35-b709-25e83be00470-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efb8b8df-bedb-4e35-b709-25e83be00470" (UID: "efb8b8df-bedb-4e35-b709-25e83be00470"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.794432 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd78c73-6590-4035-af7d-357b8451f0ad-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6cd78c73-6590-4035-af7d-357b8451f0ad" (UID: "6cd78c73-6590-4035-af7d-357b8451f0ad"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.794935 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-logs" (OuterVolumeSpecName: "logs") pod "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" (UID: "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.795415 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5de40aba-2f95-4c75-8875-8ffbf5f17898-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5de40aba-2f95-4c75-8875-8ffbf5f17898" (UID: "5de40aba-2f95-4c75-8875-8ffbf5f17898"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.796201 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb8b8df-bedb-4e35-b709-25e83be00470-kube-api-access-vrpss" (OuterVolumeSpecName: "kube-api-access-vrpss") pod "efb8b8df-bedb-4e35-b709-25e83be00470" (UID: "efb8b8df-bedb-4e35-b709-25e83be00470"). InnerVolumeSpecName "kube-api-access-vrpss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.797292 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-79e6-account-create-update-jqqxt" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.799153 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-scripts" (OuterVolumeSpecName: "scripts") pod "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" (UID: "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.802062 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-kube-api-access-sd649" (OuterVolumeSpecName: "kube-api-access-sd649") pod "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" (UID: "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b"). InnerVolumeSpecName "kube-api-access-sd649". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.815130 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5de40aba-2f95-4c75-8875-8ffbf5f17898-kube-api-access-6rb8q" (OuterVolumeSpecName: "kube-api-access-6rb8q") pod "5de40aba-2f95-4c75-8875-8ffbf5f17898" (UID: "5de40aba-2f95-4c75-8875-8ffbf5f17898"). InnerVolumeSpecName "kube-api-access-6rb8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.843314 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-config-data" (OuterVolumeSpecName: "config-data") pod "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" (UID: "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.866629 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd78c73-6590-4035-af7d-357b8451f0ad-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "6cd78c73-6590-4035-af7d-357b8451f0ad" (UID: "6cd78c73-6590-4035-af7d-357b8451f0ad"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.868443 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" (UID: "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.890734 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d951b25-e886-44c9-b7f7-d60853e1e0a9-operator-scripts\") pod \"1d951b25-e886-44c9-b7f7-d60853e1e0a9\" (UID: \"1d951b25-e886-44c9-b7f7-d60853e1e0a9\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.890834 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7csbz\" (UniqueName: \"kubernetes.io/projected/1d951b25-e886-44c9-b7f7-d60853e1e0a9-kube-api-access-7csbz\") pod \"1d951b25-e886-44c9-b7f7-d60853e1e0a9\" (UID: \"1d951b25-e886-44c9-b7f7-d60853e1e0a9\") " Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891169 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrpss\" (UniqueName: \"kubernetes.io/projected/efb8b8df-bedb-4e35-b709-25e83be00470-kube-api-access-vrpss\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891187 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rb8q\" (UniqueName: \"kubernetes.io/projected/5de40aba-2f95-4c75-8875-8ffbf5f17898-kube-api-access-6rb8q\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891196 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891205 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de40aba-2f95-4c75-8875-8ffbf5f17898-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891212 5008 reconciler_common.go:293] "Volume detached 
for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd78c73-6590-4035-af7d-357b8451f0ad-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891221 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891229 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cd78c73-6590-4035-af7d-357b8451f0ad-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891244 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd649\" (UniqueName: \"kubernetes.io/projected/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-kube-api-access-sd649\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891254 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb8b8df-bedb-4e35-b709-25e83be00470-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891262 5008 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6cd78c73-6590-4035-af7d-357b8451f0ad-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891269 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891277 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjhtg\" (UniqueName: \"kubernetes.io/projected/6cd78c73-6590-4035-af7d-357b8451f0ad-kube-api-access-kjhtg\") on node 
\"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891286 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.891344 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d951b25-e886-44c9-b7f7-d60853e1e0a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d951b25-e886-44c9-b7f7-d60853e1e0a9" (UID: "1d951b25-e886-44c9-b7f7-d60853e1e0a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.893803 5008 generic.go:334] "Generic (PLEG): container finished" podID="8679cebf-8eea-45ae-be70-26eea9396f8e" containerID="1bbd2b9a3501779f9dfc17cd725e0dca6af96fcee035c33d265e2278978c5d37" exitCode=0 Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.893872 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8679cebf-8eea-45ae-be70-26eea9396f8e","Type":"ContainerDied","Data":"1bbd2b9a3501779f9dfc17cd725e0dca6af96fcee035c33d265e2278978c5d37"} Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.897524 5008 generic.go:334] "Generic (PLEG): container finished" podID="5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84" containerID="a775bd2328e78c3e94321a78a960def3522b02e55084a83cd87ded4a5eb20603" exitCode=2 Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.897618 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.897636 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84","Type":"ContainerDied","Data":"a775bd2328e78c3e94321a78a960def3522b02e55084a83cd87ded4a5eb20603"} Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.901396 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84","Type":"ContainerDied","Data":"c9ddd22193772321e5bdebad2a1b11d644150a6ef8f1bbece8f5136d63a3e59d"} Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.901422 5008 scope.go:117] "RemoveContainer" containerID="a775bd2328e78c3e94321a78a960def3522b02e55084a83cd87ded4a5eb20603" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.903493 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="d27fb392-40df-45a9-aeae-20781d90f02b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.168:8776/healthcheck\": read tcp 10.217.0.2:57318->10.217.0.168:8776: read: connection reset by peer" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.909710 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d951b25-e886-44c9-b7f7-d60853e1e0a9-kube-api-access-7csbz" (OuterVolumeSpecName: "kube-api-access-7csbz") pod "1d951b25-e886-44c9-b7f7-d60853e1e0a9" (UID: "1d951b25-e886-44c9-b7f7-d60853e1e0a9"). InnerVolumeSpecName "kube-api-access-7csbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.911353 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ae04-account-create-update-z85w5" event={"ID":"5bc70fb8-0d00-4c19-a4d2-1721527c51e1","Type":"ContainerDied","Data":"8f6b3910eb050f98f27c41a1a4a017804bda51c07d10e8963b21659b8dfbc552"} Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.911446 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ae04-account-create-update-z85w5" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.915072 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-efa2-account-create-update-pwnw8" event={"ID":"5de40aba-2f95-4c75-8875-8ffbf5f17898","Type":"ContainerDied","Data":"152fc62dd5b8ded6f87ce1b1f610375d2f8201b002f451f320956c0f62361ca1"} Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.915236 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-efa2-account-create-update-pwnw8" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.918591 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9abb-account-create-update-z2jlj" event={"ID":"efb8b8df-bedb-4e35-b709-25e83be00470","Type":"ContainerDied","Data":"127d2c079fbd21e4853737582a34391032b25f34b9809690c073c3a085dcc538"} Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.918810 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9abb-account-create-update-z2jlj" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.931969 5008 generic.go:334] "Generic (PLEG): container finished" podID="1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" containerID="eb162fa1773c66a4240c2212925b0bc630f29080f6ea79c29b7d5dbe45bff8fa" exitCode=0 Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.932045 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b86568468-vhc29" event={"ID":"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b","Type":"ContainerDied","Data":"eb162fa1773c66a4240c2212925b0bc630f29080f6ea79c29b7d5dbe45bff8fa"} Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.932074 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b86568468-vhc29" event={"ID":"1f873fe5-8163-4b6d-8e6d-3a60914c1a3b","Type":"ContainerDied","Data":"c7a0237bc6f31cf971207a3f582cf6691ec65b493d61916de3897a92a4e4d11e"} Mar 18 18:27:24 crc kubenswrapper[5008]: E0318 18:27:24.932106 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2481964e9586771d0845e5f21af329f1722fa2234ce3442a00e54b97033e13d1 is running failed: container process not found" containerID="2481964e9586771d0845e5f21af329f1722fa2234ce3442a00e54b97033e13d1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.932246 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5b86568468-vhc29" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.941225 5008 scope.go:117] "RemoveContainer" containerID="a775bd2328e78c3e94321a78a960def3522b02e55084a83cd87ded4a5eb20603" Mar 18 18:27:24 crc kubenswrapper[5008]: E0318 18:27:24.941214 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2481964e9586771d0845e5f21af329f1722fa2234ce3442a00e54b97033e13d1 is running failed: container process not found" containerID="2481964e9586771d0845e5f21af329f1722fa2234ce3442a00e54b97033e13d1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:27:24 crc kubenswrapper[5008]: E0318 18:27:24.941939 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2481964e9586771d0845e5f21af329f1722fa2234ce3442a00e54b97033e13d1 is running failed: container process not found" containerID="2481964e9586771d0845e5f21af329f1722fa2234ce3442a00e54b97033e13d1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 18:27:24 crc kubenswrapper[5008]: E0318 18:27:24.941966 5008 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2481964e9586771d0845e5f21af329f1722fa2234ce3442a00e54b97033e13d1 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f26207e6-102f-4160-be7d-e1cad865fcc6" containerName="nova-scheduler-scheduler" Mar 18 18:27:24 crc kubenswrapper[5008]: E0318 18:27:24.942163 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a775bd2328e78c3e94321a78a960def3522b02e55084a83cd87ded4a5eb20603\": container with ID starting with a775bd2328e78c3e94321a78a960def3522b02e55084a83cd87ded4a5eb20603 not found: ID does not 
exist" containerID="a775bd2328e78c3e94321a78a960def3522b02e55084a83cd87ded4a5eb20603" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.942183 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a775bd2328e78c3e94321a78a960def3522b02e55084a83cd87ded4a5eb20603"} err="failed to get container status \"a775bd2328e78c3e94321a78a960def3522b02e55084a83cd87ded4a5eb20603\": rpc error: code = NotFound desc = could not find container \"a775bd2328e78c3e94321a78a960def3522b02e55084a83cd87ded4a5eb20603\": container with ID starting with a775bd2328e78c3e94321a78a960def3522b02e55084a83cd87ded4a5eb20603 not found: ID does not exist" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.942200 5008 scope.go:117] "RemoveContainer" containerID="eb162fa1773c66a4240c2212925b0bc630f29080f6ea79c29b7d5dbe45bff8fa" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.944847 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd78c73-6590-4035-af7d-357b8451f0ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cd78c73-6590-4035-af7d-357b8451f0ad" (UID: "6cd78c73-6590-4035-af7d-357b8451f0ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.945063 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7ab5f625-144a-4c7c-bab8-5399de3b5a8e","Type":"ContainerDied","Data":"8b2ae319823e9e3c57fabe7c051b54b23f6b58f3454be8120ede9b39f41397d6"} Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.945444 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.955538 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.967360 5008 generic.go:334] "Generic (PLEG): container finished" podID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerID="9af388372b24fa973623ad52fe731bca5ce89b4e94b9e0a247770b6b45d5bade" exitCode=0 Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.967395 5008 generic.go:334] "Generic (PLEG): container finished" podID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerID="3a5bb1f257b5ccc8acb612686fc959dffa3d2d0416f76fda835c27174ee06b01" exitCode=2 Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.967404 5008 generic.go:334] "Generic (PLEG): container finished" podID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerID="c59553cd5157024b46624fb3c5ef58bbf4ed0861562e0adaaec3f71273dc5a65" exitCode=0 Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.967464 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05f0e04a-507a-42ba-97ff-d91aa199b3db","Type":"ContainerDied","Data":"9af388372b24fa973623ad52fe731bca5ce89b4e94b9e0a247770b6b45d5bade"} Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.967490 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05f0e04a-507a-42ba-97ff-d91aa199b3db","Type":"ContainerDied","Data":"3a5bb1f257b5ccc8acb612686fc959dffa3d2d0416f76fda835c27174ee06b01"} Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.967503 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05f0e04a-507a-42ba-97ff-d91aa199b3db","Type":"ContainerDied","Data":"c59553cd5157024b46624fb3c5ef58bbf4ed0861562e0adaaec3f71273dc5a65"} Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.972930 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-2f63-account-create-update-z57k6" event={"ID":"a5b1de51-7913-41fc-afd9-b1f901532d03","Type":"ContainerDied","Data":"73944b9228d1cb29c8a63645c569238f967f3be885d1db23993a400cf00b7f3a"} Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.973010 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2f63-account-create-update-z57k6" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.974311 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.986325 5008 scope.go:117] "RemoveContainer" containerID="41318c69eaf0ebfd748306eccddaeed6ec2d4a9388b3be00fa57b1b52a5ad7b8" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.991223 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-efa2-account-create-update-pwnw8"] Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.992375 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d951b25-e886-44c9-b7f7-d60853e1e0a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.992389 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7csbz\" (UniqueName: \"kubernetes.io/projected/1d951b25-e886-44c9-b7f7-d60853e1e0a9-kube-api-access-7csbz\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.992398 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd78c73-6590-4035-af7d-357b8451f0ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:24 crc kubenswrapper[5008]: I0318 18:27:24.995861 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-public-tls-certs" (OuterVolumeSpecName: 
"public-tls-certs") pod "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" (UID: "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.000441 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-efa2-account-create-update-pwnw8"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.007254 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" (UID: "1f873fe5-8163-4b6d-8e6d-3a60914c1a3b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.007390 5008 generic.go:334] "Generic (PLEG): container finished" podID="f26207e6-102f-4160-be7d-e1cad865fcc6" containerID="2481964e9586771d0845e5f21af329f1722fa2234ce3442a00e54b97033e13d1" exitCode=0 Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.007518 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f26207e6-102f-4160-be7d-e1cad865fcc6","Type":"ContainerDied","Data":"2481964e9586771d0845e5f21af329f1722fa2234ce3442a00e54b97033e13d1"} Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.016050 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"07bd6644-ca18-4b8d-ad83-9757257768fb","Type":"ContainerDied","Data":"6b4bbe809426c149275bcc733e3305d775b85bc2c067d17587c0a8995a65da18"} Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.016299 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.017984 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3502-account-create-update-qzqhp" event={"ID":"8026d1a2-1e3a-4930-9424-56565551f4bb","Type":"ContainerDied","Data":"19c05e4dd42f97abf2758c77aae2c62c0164eb23cb3f97c6e0762219ed0ca874"} Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.018028 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3502-account-create-update-qzqhp" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.025007 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-79e6-account-create-update-jqqxt" event={"ID":"1d951b25-e886-44c9-b7f7-d60853e1e0a9","Type":"ContainerDied","Data":"8ae3752dcf688ed68d1f4db12af538ffb3d85c59068c483d0dcc7ed00b18b023"} Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.025109 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-79e6-account-create-update-jqqxt" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.042231 5008 scope.go:117] "RemoveContainer" containerID="eb162fa1773c66a4240c2212925b0bc630f29080f6ea79c29b7d5dbe45bff8fa" Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.042746 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb162fa1773c66a4240c2212925b0bc630f29080f6ea79c29b7d5dbe45bff8fa\": container with ID starting with eb162fa1773c66a4240c2212925b0bc630f29080f6ea79c29b7d5dbe45bff8fa not found: ID does not exist" containerID="eb162fa1773c66a4240c2212925b0bc630f29080f6ea79c29b7d5dbe45bff8fa" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.042785 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb162fa1773c66a4240c2212925b0bc630f29080f6ea79c29b7d5dbe45bff8fa"} err="failed to get container status \"eb162fa1773c66a4240c2212925b0bc630f29080f6ea79c29b7d5dbe45bff8fa\": rpc error: code = NotFound desc = could not find container \"eb162fa1773c66a4240c2212925b0bc630f29080f6ea79c29b7d5dbe45bff8fa\": container with ID starting with eb162fa1773c66a4240c2212925b0bc630f29080f6ea79c29b7d5dbe45bff8fa not found: ID does not exist" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.042807 5008 scope.go:117] "RemoveContainer" containerID="41318c69eaf0ebfd748306eccddaeed6ec2d4a9388b3be00fa57b1b52a5ad7b8" Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.043051 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41318c69eaf0ebfd748306eccddaeed6ec2d4a9388b3be00fa57b1b52a5ad7b8\": container with ID starting with 41318c69eaf0ebfd748306eccddaeed6ec2d4a9388b3be00fa57b1b52a5ad7b8 not found: ID does not exist" containerID="41318c69eaf0ebfd748306eccddaeed6ec2d4a9388b3be00fa57b1b52a5ad7b8" Mar 18 18:27:25 crc 
kubenswrapper[5008]: I0318 18:27:25.043068 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41318c69eaf0ebfd748306eccddaeed6ec2d4a9388b3be00fa57b1b52a5ad7b8"} err="failed to get container status \"41318c69eaf0ebfd748306eccddaeed6ec2d4a9388b3be00fa57b1b52a5ad7b8\": rpc error: code = NotFound desc = could not find container \"41318c69eaf0ebfd748306eccddaeed6ec2d4a9388b3be00fa57b1b52a5ad7b8\": container with ID starting with 41318c69eaf0ebfd748306eccddaeed6ec2d4a9388b3be00fa57b1b52a5ad7b8 not found: ID does not exist" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.043080 5008 scope.go:117] "RemoveContainer" containerID="c2dddb0fc09ab3bed8b4a15346419bed092d4dcea57bf35b28cdac3523e7802a" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.043187 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ae04-account-create-update-z85w5"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.049241 5008 generic.go:334] "Generic (PLEG): container finished" podID="6cd78c73-6590-4035-af7d-357b8451f0ad" containerID="a10f7834955df6f8ab45ed451b62d37ed1daf41ca59473a2c03038d777c8d5dd" exitCode=0 Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.049355 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.049497 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6cd78c73-6590-4035-af7d-357b8451f0ad","Type":"ContainerDied","Data":"a10f7834955df6f8ab45ed451b62d37ed1daf41ca59473a2c03038d777c8d5dd"} Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.049539 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6cd78c73-6590-4035-af7d-357b8451f0ad","Type":"ContainerDied","Data":"6e2596634a432643cb927ad71b51f78ca543e4f3375523619d8da0fd72245781"} Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.060788 5008 generic.go:334] "Generic (PLEG): container finished" podID="582dafe2-2020-4966-921d-cc5e9f0db46c" containerID="7c0883124c7538aea54980f006ad5cde12fc4e637570de29a2ca5d0d49c482e5" exitCode=0 Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.060893 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0a95-account-create-update-wpsgf" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.060998 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"582dafe2-2020-4966-921d-cc5e9f0db46c","Type":"ContainerDied","Data":"7c0883124c7538aea54980f006ad5cde12fc4e637570de29a2ca5d0d49c482e5"} Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.074218 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ae04-account-create-update-z85w5"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.095121 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.095157 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.118863 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9abb-account-create-update-z2jlj"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.139230 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9abb-account-create-update-z2jlj"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.146476 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.171134 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.173622 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.183171 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a95-account-create-update-wpsgf" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.203879 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg2nl\" (UniqueName: \"kubernetes.io/projected/582dafe2-2020-4966-921d-cc5e9f0db46c-kube-api-access-qg2nl\") pod \"582dafe2-2020-4966-921d-cc5e9f0db46c\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.203923 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582dafe2-2020-4966-921d-cc5e9f0db46c-logs\") pod \"582dafe2-2020-4966-921d-cc5e9f0db46c\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.203956 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"582dafe2-2020-4966-921d-cc5e9f0db46c\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.203987 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-internal-tls-certs\") pod \"582dafe2-2020-4966-921d-cc5e9f0db46c\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.204013 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-config-data\") pod \"582dafe2-2020-4966-921d-cc5e9f0db46c\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.204032 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-combined-ca-bundle\") pod \"582dafe2-2020-4966-921d-cc5e9f0db46c\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.204086 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-scripts\") pod \"582dafe2-2020-4966-921d-cc5e9f0db46c\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.204142 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/582dafe2-2020-4966-921d-cc5e9f0db46c-httpd-run\") pod \"582dafe2-2020-4966-921d-cc5e9f0db46c\" (UID: \"582dafe2-2020-4966-921d-cc5e9f0db46c\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.205591 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/582dafe2-2020-4966-921d-cc5e9f0db46c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "582dafe2-2020-4966-921d-cc5e9f0db46c" (UID: "582dafe2-2020-4966-921d-cc5e9f0db46c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.205925 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/582dafe2-2020-4966-921d-cc5e9f0db46c-logs" (OuterVolumeSpecName: "logs") pod "582dafe2-2020-4966-921d-cc5e9f0db46c" (UID: "582dafe2-2020-4966-921d-cc5e9f0db46c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.208123 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-scripts" (OuterVolumeSpecName: "scripts") pod "582dafe2-2020-4966-921d-cc5e9f0db46c" (UID: "582dafe2-2020-4966-921d-cc5e9f0db46c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.219708 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/582dafe2-2020-4966-921d-cc5e9f0db46c-kube-api-access-qg2nl" (OuterVolumeSpecName: "kube-api-access-qg2nl") pod "582dafe2-2020-4966-921d-cc5e9f0db46c" (UID: "582dafe2-2020-4966-921d-cc5e9f0db46c"). InnerVolumeSpecName "kube-api-access-qg2nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.226342 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "582dafe2-2020-4966-921d-cc5e9f0db46c" (UID: "582dafe2-2020-4966-921d-cc5e9f0db46c"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.244833 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.262140 5008 scope.go:117] "RemoveContainer" containerID="4cc6ebdf06122d5824ebe49392c76f20a08604dc1120f4380c5f58b19d7e4d01" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.262176 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "582dafe2-2020-4966-921d-cc5e9f0db46c" (UID: "582dafe2-2020-4966-921d-cc5e9f0db46c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.286882 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.287044 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.289428 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.289893 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.289946 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9qcqj" podUID="aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" containerName="ovn-controller" probeResult="failure" output=< Mar 18 18:27:25 crc kubenswrapper[5008]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Mar 18 18:27:25 crc kubenswrapper[5008]: > Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.289945 5008 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovsdb-server" Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.290823 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.298412 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.310645 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/582dafe2-2020-4966-921d-cc5e9f0db46c-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.311078 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg2nl\" (UniqueName: \"kubernetes.io/projected/582dafe2-2020-4966-921d-cc5e9f0db46c-kube-api-access-qg2nl\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.311096 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582dafe2-2020-4966-921d-cc5e9f0db46c-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.311167 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.311180 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.311190 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.313678 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.313784 5008 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovs-vswitchd" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.328541 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-config-data" (OuterVolumeSpecName: "config-data") pod "582dafe2-2020-4966-921d-cc5e9f0db46c" (UID: "582dafe2-2020-4966-921d-cc5e9f0db46c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.330011 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3502-account-create-update-qzqhp"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.348931 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3502-account-create-update-qzqhp"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.353384 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.364470 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "582dafe2-2020-4966-921d-cc5e9f0db46c" (UID: "582dafe2-2020-4966-921d-cc5e9f0db46c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.370369 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2f63-account-create-update-z57k6"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.374276 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2f63-account-create-update-z57k6"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.379255 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.391795 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.409610 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-79e6-account-create-update-jqqxt"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.409662 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-79e6-account-create-update-jqqxt"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.414076 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.414316 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.414328 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582dafe2-2020-4966-921d-cc5e9f0db46c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.495652 5008 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/barbican-api-77598b888d-8wwqt" podUID="8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:37974->10.217.0.160:9311: read: connection reset by peer" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.495682 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-77598b888d-8wwqt" podUID="8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:37958->10.217.0.160:9311: read: connection reset by peer" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.519256 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db886769-350c-4f91-a8b7-77b357bc7cda-operator-scripts\") pod \"keystone-0a95-account-create-update-wpsgf\" (UID: \"db886769-350c-4f91-a8b7-77b357bc7cda\") " pod="openstack/keystone-0a95-account-create-update-wpsgf" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.519303 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kml5c\" (UniqueName: \"kubernetes.io/projected/db886769-350c-4f91-a8b7-77b357bc7cda-kube-api-access-kml5c\") pod \"keystone-0a95-account-create-update-wpsgf\" (UID: \"db886769-350c-4f91-a8b7-77b357bc7cda\") " pod="openstack/keystone-0a95-account-create-update-wpsgf" Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.519747 5008 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.519955 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db886769-350c-4f91-a8b7-77b357bc7cda-operator-scripts podName:db886769-350c-4f91-a8b7-77b357bc7cda nodeName:}" failed. 
No retries permitted until 2026-03-18 18:27:27.519933152 +0000 UTC m=+1504.039406241 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/db886769-350c-4f91-a8b7-77b357bc7cda-operator-scripts") pod "keystone-0a95-account-create-update-wpsgf" (UID: "db886769-350c-4f91-a8b7-77b357bc7cda") : configmap "openstack-scripts" not found Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.522624 5008 projected.go:194] Error preparing data for projected volume kube-api-access-kml5c for pod openstack/keystone-0a95-account-create-update-wpsgf: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.522693 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db886769-350c-4f91-a8b7-77b357bc7cda-kube-api-access-kml5c podName:db886769-350c-4f91-a8b7-77b357bc7cda nodeName:}" failed. No retries permitted until 2026-03-18 18:27:27.522673024 +0000 UTC m=+1504.042146203 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kml5c" (UniqueName: "kubernetes.io/projected/db886769-350c-4f91-a8b7-77b357bc7cda-kube-api-access-kml5c") pod "keystone-0a95-account-create-update-wpsgf" (UID: "db886769-350c-4f91-a8b7-77b357bc7cda") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.658814 5008 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.658895 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts podName:8724ccad-851e-4efc-ad3c-d34252a3f29f nodeName:}" failed. No retries permitted until 2026-03-18 18:27:29.658873626 +0000 UTC m=+1506.178346805 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts") pod "root-account-create-update-j5dtq" (UID: "8724ccad-851e-4efc-ad3c-d34252a3f29f") : configmap "openstack-cell1-scripts" not found Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.689494 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5b86568468-vhc29"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.691406 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.694269 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5b86568468-vhc29"] Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.698425 5008 scope.go:117] "RemoveContainer" containerID="3f3486d7e9bc9fc522f426c4d18938f1448d83d50daec3ae5eec87e410c91eb6" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.700375 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.726938 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j5dtq" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.759426 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-scripts\") pod \"8679cebf-8eea-45ae-be70-26eea9396f8e\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.759791 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8679cebf-8eea-45ae-be70-26eea9396f8e-logs\") pod \"8679cebf-8eea-45ae-be70-26eea9396f8e\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.759894 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-public-tls-certs\") pod \"8679cebf-8eea-45ae-be70-26eea9396f8e\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.759953 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26207e6-102f-4160-be7d-e1cad865fcc6-combined-ca-bundle\") pod \"f26207e6-102f-4160-be7d-e1cad865fcc6\" (UID: \"f26207e6-102f-4160-be7d-e1cad865fcc6\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.760095 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v8wg\" (UniqueName: \"kubernetes.io/projected/8724ccad-851e-4efc-ad3c-d34252a3f29f-kube-api-access-5v8wg\") pod \"8724ccad-851e-4efc-ad3c-d34252a3f29f\" (UID: \"8724ccad-851e-4efc-ad3c-d34252a3f29f\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.760178 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-combined-ca-bundle\") pod \"8679cebf-8eea-45ae-be70-26eea9396f8e\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.760226 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts\") pod \"8724ccad-851e-4efc-ad3c-d34252a3f29f\" (UID: \"8724ccad-851e-4efc-ad3c-d34252a3f29f\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.760278 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26207e6-102f-4160-be7d-e1cad865fcc6-config-data\") pod \"f26207e6-102f-4160-be7d-e1cad865fcc6\" (UID: \"f26207e6-102f-4160-be7d-e1cad865fcc6\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.760366 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"8679cebf-8eea-45ae-be70-26eea9396f8e\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.760435 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-config-data\") pod \"8679cebf-8eea-45ae-be70-26eea9396f8e\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.760471 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdh28\" (UniqueName: \"kubernetes.io/projected/f26207e6-102f-4160-be7d-e1cad865fcc6-kube-api-access-pdh28\") pod \"f26207e6-102f-4160-be7d-e1cad865fcc6\" (UID: \"f26207e6-102f-4160-be7d-e1cad865fcc6\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.760527 5008 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdvcs\" (UniqueName: \"kubernetes.io/projected/8679cebf-8eea-45ae-be70-26eea9396f8e-kube-api-access-sdvcs\") pod \"8679cebf-8eea-45ae-be70-26eea9396f8e\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.760580 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8679cebf-8eea-45ae-be70-26eea9396f8e-httpd-run\") pod \"8679cebf-8eea-45ae-be70-26eea9396f8e\" (UID: \"8679cebf-8eea-45ae-be70-26eea9396f8e\") " Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.760927 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8679cebf-8eea-45ae-be70-26eea9396f8e-logs" (OuterVolumeSpecName: "logs") pod "8679cebf-8eea-45ae-be70-26eea9396f8e" (UID: "8679cebf-8eea-45ae-be70-26eea9396f8e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.761845 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8679cebf-8eea-45ae-be70-26eea9396f8e-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.764926 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-scripts" (OuterVolumeSpecName: "scripts") pod "8679cebf-8eea-45ae-be70-26eea9396f8e" (UID: "8679cebf-8eea-45ae-be70-26eea9396f8e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.765167 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8679cebf-8eea-45ae-be70-26eea9396f8e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8679cebf-8eea-45ae-be70-26eea9396f8e" (UID: "8679cebf-8eea-45ae-be70-26eea9396f8e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.766194 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "8679cebf-8eea-45ae-be70-26eea9396f8e" (UID: "8679cebf-8eea-45ae-be70-26eea9396f8e"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.766534 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8724ccad-851e-4efc-ad3c-d34252a3f29f" (UID: "8724ccad-851e-4efc-ad3c-d34252a3f29f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.768938 5008 scope.go:117] "RemoveContainer" containerID="a10f7834955df6f8ab45ed451b62d37ed1daf41ca59473a2c03038d777c8d5dd" Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.770290 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8679cebf-8eea-45ae-be70-26eea9396f8e-kube-api-access-sdvcs" (OuterVolumeSpecName: "kube-api-access-sdvcs") pod "8679cebf-8eea-45ae-be70-26eea9396f8e" (UID: "8679cebf-8eea-45ae-be70-26eea9396f8e"). InnerVolumeSpecName "kube-api-access-sdvcs". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.774620 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.774861 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8724ccad-851e-4efc-ad3c-d34252a3f29f-kube-api-access-5v8wg" (OuterVolumeSpecName: "kube-api-access-5v8wg") pod "8724ccad-851e-4efc-ad3c-d34252a3f29f" (UID: "8724ccad-851e-4efc-ad3c-d34252a3f29f"). InnerVolumeSpecName "kube-api-access-5v8wg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.784955 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26207e6-102f-4160-be7d-e1cad865fcc6-kube-api-access-pdh28" (OuterVolumeSpecName: "kube-api-access-pdh28") pod "f26207e6-102f-4160-be7d-e1cad865fcc6" (UID: "f26207e6-102f-4160-be7d-e1cad865fcc6"). InnerVolumeSpecName "kube-api-access-pdh28". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.862618 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96efea0e-17ae-49c4-8f5c-b7341def6878-logs\") pod \"96efea0e-17ae-49c4-8f5c-b7341def6878\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.862693 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-config-data\") pod \"96efea0e-17ae-49c4-8f5c-b7341def6878\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.862761 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6hx7\" (UniqueName: \"kubernetes.io/projected/96efea0e-17ae-49c4-8f5c-b7341def6878-kube-api-access-r6hx7\") pod \"96efea0e-17ae-49c4-8f5c-b7341def6878\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.862870 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-combined-ca-bundle\") pod \"96efea0e-17ae-49c4-8f5c-b7341def6878\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.862899 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-nova-metadata-tls-certs\") pod \"96efea0e-17ae-49c4-8f5c-b7341def6878\" (UID: \"96efea0e-17ae-49c4-8f5c-b7341def6878\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.863582 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdh28\" (UniqueName: \"kubernetes.io/projected/f26207e6-102f-4160-be7d-e1cad865fcc6-kube-api-access-pdh28\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.863600 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdvcs\" (UniqueName: \"kubernetes.io/projected/8679cebf-8eea-45ae-be70-26eea9396f8e-kube-api-access-sdvcs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.863631 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8679cebf-8eea-45ae-be70-26eea9396f8e-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.863645 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.863657 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v8wg\" (UniqueName: \"kubernetes.io/projected/8724ccad-851e-4efc-ad3c-d34252a3f29f-kube-api-access-5v8wg\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.863668 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724ccad-851e-4efc-ad3c-d34252a3f29f-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.863693 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.864693 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96efea0e-17ae-49c4-8f5c-b7341def6878-logs" (OuterVolumeSpecName: "logs") pod "96efea0e-17ae-49c4-8f5c-b7341def6878" (UID: "96efea0e-17ae-49c4-8f5c-b7341def6878"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.867792 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8679cebf-8eea-45ae-be70-26eea9396f8e" (UID: "8679cebf-8eea-45ae-be70-26eea9396f8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.867849 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f26207e6-102f-4160-be7d-e1cad865fcc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f26207e6-102f-4160-be7d-e1cad865fcc6" (UID: "f26207e6-102f-4160-be7d-e1cad865fcc6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.876771 5008 scope.go:117] "RemoveContainer" containerID="a10f7834955df6f8ab45ed451b62d37ed1daf41ca59473a2c03038d777c8d5dd"
Mar 18 18:27:25 crc kubenswrapper[5008]: E0318 18:27:25.882691 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a10f7834955df6f8ab45ed451b62d37ed1daf41ca59473a2c03038d777c8d5dd\": container with ID starting with a10f7834955df6f8ab45ed451b62d37ed1daf41ca59473a2c03038d777c8d5dd not found: ID does not exist" containerID="a10f7834955df6f8ab45ed451b62d37ed1daf41ca59473a2c03038d777c8d5dd"
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.882749 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a10f7834955df6f8ab45ed451b62d37ed1daf41ca59473a2c03038d777c8d5dd"} err="failed to get container status \"a10f7834955df6f8ab45ed451b62d37ed1daf41ca59473a2c03038d777c8d5dd\": rpc error: code = NotFound desc = could not find container \"a10f7834955df6f8ab45ed451b62d37ed1daf41ca59473a2c03038d777c8d5dd\": container with ID starting with a10f7834955df6f8ab45ed451b62d37ed1daf41ca59473a2c03038d777c8d5dd not found: ID does not exist"
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.885884 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f26207e6-102f-4160-be7d-e1cad865fcc6-config-data" (OuterVolumeSpecName: "config-data") pod "f26207e6-102f-4160-be7d-e1cad865fcc6" (UID: "f26207e6-102f-4160-be7d-e1cad865fcc6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.917389 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96efea0e-17ae-49c4-8f5c-b7341def6878-kube-api-access-r6hx7" (OuterVolumeSpecName: "kube-api-access-r6hx7") pod "96efea0e-17ae-49c4-8f5c-b7341def6878" (UID: "96efea0e-17ae-49c4-8f5c-b7341def6878"). InnerVolumeSpecName "kube-api-access-r6hx7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.929293 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.930303 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.932439 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8679cebf-8eea-45ae-be70-26eea9396f8e" (UID: "8679cebf-8eea-45ae-be70-26eea9396f8e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.936898 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.949952 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "96efea0e-17ae-49c4-8f5c-b7341def6878" (UID: "96efea0e-17ae-49c4-8f5c-b7341def6878"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.972274 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cndf6\" (UniqueName: \"kubernetes.io/projected/d27fb392-40df-45a9-aeae-20781d90f02b-kube-api-access-cndf6\") pod \"d27fb392-40df-45a9-aeae-20781d90f02b\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973247 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d27fb392-40df-45a9-aeae-20781d90f02b-etc-machine-id\") pod \"d27fb392-40df-45a9-aeae-20781d90f02b\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973273 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-config-data-custom\") pod \"d27fb392-40df-45a9-aeae-20781d90f02b\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973324 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxw6x\" (UniqueName: \"kubernetes.io/projected/bda3600a-d612-43ec-8b45-77eccc420b0f-kube-api-access-bxw6x\") pod \"bda3600a-d612-43ec-8b45-77eccc420b0f\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973384 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-config-data\") pod \"d27fb392-40df-45a9-aeae-20781d90f02b\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973584 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d27fb392-40df-45a9-aeae-20781d90f02b-logs\") pod \"d27fb392-40df-45a9-aeae-20781d90f02b\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973636 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-public-tls-certs\") pod \"d27fb392-40df-45a9-aeae-20781d90f02b\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973663 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-combined-ca-bundle\") pod \"d27fb392-40df-45a9-aeae-20781d90f02b\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973693 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-config-data\") pod \"bda3600a-d612-43ec-8b45-77eccc420b0f\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973715 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-combined-ca-bundle\") pod \"bda3600a-d612-43ec-8b45-77eccc420b0f\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973738 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-public-tls-certs\") pod \"bda3600a-d612-43ec-8b45-77eccc420b0f\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973775 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-scripts\") pod \"d27fb392-40df-45a9-aeae-20781d90f02b\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973811 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-internal-tls-certs\") pod \"d27fb392-40df-45a9-aeae-20781d90f02b\" (UID: \"d27fb392-40df-45a9-aeae-20781d90f02b\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973841 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bda3600a-d612-43ec-8b45-77eccc420b0f-logs\") pod \"bda3600a-d612-43ec-8b45-77eccc420b0f\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.973879 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-internal-tls-certs\") pod \"bda3600a-d612-43ec-8b45-77eccc420b0f\" (UID: \"bda3600a-d612-43ec-8b45-77eccc420b0f\") "
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.974621 5008 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.974640 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96efea0e-17ae-49c4-8f5c-b7341def6878-logs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.974652 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.974663 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26207e6-102f-4160-be7d-e1cad865fcc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.974674 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.974684 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26207e6-102f-4160-be7d-e1cad865fcc6-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.974697 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6hx7\" (UniqueName: \"kubernetes.io/projected/96efea0e-17ae-49c4-8f5c-b7341def6878-kube-api-access-r6hx7\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.974708 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.978439 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d27fb392-40df-45a9-aeae-20781d90f02b-logs" (OuterVolumeSpecName: "logs") pod "d27fb392-40df-45a9-aeae-20781d90f02b" (UID: "d27fb392-40df-45a9-aeae-20781d90f02b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.986776 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d27fb392-40df-45a9-aeae-20781d90f02b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d27fb392-40df-45a9-aeae-20781d90f02b" (UID: "d27fb392-40df-45a9-aeae-20781d90f02b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.989155 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d27fb392-40df-45a9-aeae-20781d90f02b-kube-api-access-cndf6" (OuterVolumeSpecName: "kube-api-access-cndf6") pod "d27fb392-40df-45a9-aeae-20781d90f02b" (UID: "d27fb392-40df-45a9-aeae-20781d90f02b"). InnerVolumeSpecName "kube-api-access-cndf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.989857 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bda3600a-d612-43ec-8b45-77eccc420b0f-logs" (OuterVolumeSpecName: "logs") pod "bda3600a-d612-43ec-8b45-77eccc420b0f" (UID: "bda3600a-d612-43ec-8b45-77eccc420b0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.994980 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda3600a-d612-43ec-8b45-77eccc420b0f-kube-api-access-bxw6x" (OuterVolumeSpecName: "kube-api-access-bxw6x") pod "bda3600a-d612-43ec-8b45-77eccc420b0f" (UID: "bda3600a-d612-43ec-8b45-77eccc420b0f"). InnerVolumeSpecName "kube-api-access-bxw6x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.995094 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-config-data" (OuterVolumeSpecName: "config-data") pod "96efea0e-17ae-49c4-8f5c-b7341def6878" (UID: "96efea0e-17ae-49c4-8f5c-b7341def6878"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.996729 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d27fb392-40df-45a9-aeae-20781d90f02b" (UID: "d27fb392-40df-45a9-aeae-20781d90f02b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:25 crc kubenswrapper[5008]: I0318 18:27:25.997041 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-scripts" (OuterVolumeSpecName: "scripts") pod "d27fb392-40df-45a9-aeae-20781d90f02b" (UID: "d27fb392-40df-45a9-aeae-20781d90f02b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.057529 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d27fb392-40df-45a9-aeae-20781d90f02b" (UID: "d27fb392-40df-45a9-aeae-20781d90f02b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.076736 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bda3600a-d612-43ec-8b45-77eccc420b0f-logs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.076779 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.076791 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cndf6\" (UniqueName: \"kubernetes.io/projected/d27fb392-40df-45a9-aeae-20781d90f02b-kube-api-access-cndf6\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.076801 5008 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d27fb392-40df-45a9-aeae-20781d90f02b-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.076811 5008 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.076822 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxw6x\" (UniqueName: \"kubernetes.io/projected/bda3600a-d612-43ec-8b45-77eccc420b0f-kube-api-access-bxw6x\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.076831 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d27fb392-40df-45a9-aeae-20781d90f02b-logs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.076840 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.076848 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.082985 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-config-data" (OuterVolumeSpecName: "config-data") pod "bda3600a-d612-43ec-8b45-77eccc420b0f" (UID: "bda3600a-d612-43ec-8b45-77eccc420b0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.083091 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bda3600a-d612-43ec-8b45-77eccc420b0f" (UID: "bda3600a-d612-43ec-8b45-77eccc420b0f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.089611 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"582dafe2-2020-4966-921d-cc5e9f0db46c","Type":"ContainerDied","Data":"b02240955d9f245f85f6f408ff050e2a3308eb99e270630612c0e482cdcaa037"}
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.089673 5008 scope.go:117] "RemoveContainer" containerID="7c0883124c7538aea54980f006ad5cde12fc4e637570de29a2ca5d0d49c482e5"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.089821 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.093294 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f26207e6-102f-4160-be7d-e1cad865fcc6","Type":"ContainerDied","Data":"a545237e6de452fad29c24896e2bdd27a03ed64885aa9d48241b069b1c051e36"}
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.093379 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.103128 5008 generic.go:334] "Generic (PLEG): container finished" podID="96efea0e-17ae-49c4-8f5c-b7341def6878" containerID="2662973c09a191ca3f45d2dc2e683ec4746e264a085d56eb97c4eff582206931" exitCode=0
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.103190 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"96efea0e-17ae-49c4-8f5c-b7341def6878","Type":"ContainerDied","Data":"2662973c09a191ca3f45d2dc2e683ec4746e264a085d56eb97c4eff582206931"}
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.103216 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"96efea0e-17ae-49c4-8f5c-b7341def6878","Type":"ContainerDied","Data":"7540b2a740d04535fe5ca22edb79d9cf3b3ad9c77e4251988ad1a5c36720936c"}
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.103282 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.106193 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96efea0e-17ae-49c4-8f5c-b7341def6878" (UID: "96efea0e-17ae-49c4-8f5c-b7341def6878"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.106419 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8679cebf-8eea-45ae-be70-26eea9396f8e","Type":"ContainerDied","Data":"d2758c296b22f071f0cb5ab2f5c2f3b24fad467d0ff8428fcc456dbb490f93c2"}
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.108336 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.111819 5008 generic.go:334] "Generic (PLEG): container finished" podID="8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" containerID="4eadc2692f3a8d44b36203cb5d3923e2c3a539a3f5ba5c413d2a6377e7749375" exitCode=0
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.111872 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77598b888d-8wwqt" event={"ID":"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad","Type":"ContainerDied","Data":"4eadc2692f3a8d44b36203cb5d3923e2c3a539a3f5ba5c413d2a6377e7749375"}
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.136521 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-config-data" (OuterVolumeSpecName: "config-data") pod "d27fb392-40df-45a9-aeae-20781d90f02b" (UID: "d27fb392-40df-45a9-aeae-20781d90f02b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.141901 5008 generic.go:334] "Generic (PLEG): container finished" podID="d27fb392-40df-45a9-aeae-20781d90f02b" containerID="890384dd81bae30a3f9518e1b6415cdef1c842f2f56865d0c4e3551f7664eff2" exitCode=0
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.142018 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d27fb392-40df-45a9-aeae-20781d90f02b","Type":"ContainerDied","Data":"890384dd81bae30a3f9518e1b6415cdef1c842f2f56865d0c4e3551f7664eff2"}
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.142050 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d27fb392-40df-45a9-aeae-20781d90f02b","Type":"ContainerDied","Data":"f9d031e798aae3fcaedcc5d62a3d819b62b8f95da8741dbe5ae4088bfcd14e75"}
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.142077 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.146095 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-config-data" (OuterVolumeSpecName: "config-data") pod "8679cebf-8eea-45ae-be70-26eea9396f8e" (UID: "8679cebf-8eea-45ae-be70-26eea9396f8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.152468 5008 generic.go:334] "Generic (PLEG): container finished" podID="bda3600a-d612-43ec-8b45-77eccc420b0f" containerID="4464660f5642808fbb37e8da2bb43669e2159a066dcb94276d716910e10ff0cd" exitCode=0
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.152656 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.153037 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bda3600a-d612-43ec-8b45-77eccc420b0f","Type":"ContainerDied","Data":"4464660f5642808fbb37e8da2bb43669e2159a066dcb94276d716910e10ff0cd"}
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.153120 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bda3600a-d612-43ec-8b45-77eccc420b0f","Type":"ContainerDied","Data":"2fabe4b5ebd4874832d3f316148bf564ee1303e60043065e81a128ff85cd3852"}
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.156899 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bda3600a-d612-43ec-8b45-77eccc420b0f" (UID: "bda3600a-d612-43ec-8b45-77eccc420b0f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.161609 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j5dtq" event={"ID":"8724ccad-851e-4efc-ad3c-d34252a3f29f","Type":"ContainerDied","Data":"f05d286e2f3388afd6a09ef5abb6b12bf7d24c7c48c4d7b66890cc0933acae27"}
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.161669 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j5dtq"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.170740 5008 generic.go:334] "Generic (PLEG): container finished" podID="8724770c-4223-4cfe-b35b-be7cd1a6a9ff" containerID="2735f632f3585bddce99d7d606ae426bc322f9f8793ae4e5d5d4ce755bf8652e" exitCode=0
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.170805 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8724770c-4223-4cfe-b35b-be7cd1a6a9ff","Type":"ContainerDied","Data":"2735f632f3585bddce99d7d606ae426bc322f9f8793ae4e5d5d4ce755bf8652e"}
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.171379 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bda3600a-d612-43ec-8b45-77eccc420b0f" (UID: "bda3600a-d612-43ec-8b45-77eccc420b0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.172354 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d27fb392-40df-45a9-aeae-20781d90f02b" (UID: "d27fb392-40df-45a9-aeae-20781d90f02b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.178333 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.178356 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8679cebf-8eea-45ae-be70-26eea9396f8e-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.178366 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.178374 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.178383 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.178392 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.178400 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96efea0e-17ae-49c4-8f5c-b7341def6878-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.178408 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda3600a-d612-43ec-8b45-77eccc420b0f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.180806 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7f1c2fc8-83c6-4183-ac62-f23ad5db8610/ovn-northd/0.log"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.180839 5008 generic.go:334] "Generic (PLEG): container finished" podID="7f1c2fc8-83c6-4183-ac62-f23ad5db8610" containerID="a68259ee70a42cd608552137a47d03512f5b8b70baec67d791e451ee9cf46bac" exitCode=139
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.180878 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7f1c2fc8-83c6-4183-ac62-f23ad5db8610","Type":"ContainerDied","Data":"a68259ee70a42cd608552137a47d03512f5b8b70baec67d791e451ee9cf46bac"}
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.184289 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a95-account-create-update-wpsgf"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.200952 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77598b888d-8wwqt"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.219745 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07bd6644-ca18-4b8d-ad83-9757257768fb" path="/var/lib/kubelet/pods/07bd6644-ca18-4b8d-ad83-9757257768fb/volumes"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.220542 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d951b25-e886-44c9-b7f7-d60853e1e0a9" path="/var/lib/kubelet/pods/1d951b25-e886-44c9-b7f7-d60853e1e0a9/volumes"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.221034 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" path="/var/lib/kubelet/pods/1f873fe5-8163-4b6d-8e6d-3a60914c1a3b/volumes"
Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.221838 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d27fb392-40df-45a9-aeae-20781d90f02b" (UID: "d27fb392-40df-45a9-aeae-20781d90f02b"). InnerVolumeSpecName "internal-tls-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.222313 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bc70fb8-0d00-4c19-a4d2-1721527c51e1" path="/var/lib/kubelet/pods/5bc70fb8-0d00-4c19-a4d2-1721527c51e1/volumes" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.225621 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5de40aba-2f95-4c75-8875-8ffbf5f17898" path="/var/lib/kubelet/pods/5de40aba-2f95-4c75-8875-8ffbf5f17898/volumes" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.226470 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84" path="/var/lib/kubelet/pods/5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84/volumes" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.227305 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd78c73-6590-4035-af7d-357b8451f0ad" path="/var/lib/kubelet/pods/6cd78c73-6590-4035-af7d-357b8451f0ad/volumes" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.228333 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ab5f625-144a-4c7c-bab8-5399de3b5a8e" path="/var/lib/kubelet/pods/7ab5f625-144a-4c7c-bab8-5399de3b5a8e/volumes" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.230416 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8026d1a2-1e3a-4930-9424-56565551f4bb" path="/var/lib/kubelet/pods/8026d1a2-1e3a-4930-9424-56565551f4bb/volumes" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.231137 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5b1de51-7913-41fc-afd9-b1f901532d03" path="/var/lib/kubelet/pods/a5b1de51-7913-41fc-afd9-b1f901532d03/volumes" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.242482 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb8b8df-bedb-4e35-b709-25e83be00470" 
path="/var/lib/kubelet/pods/efb8b8df-bedb-4e35-b709-25e83be00470/volumes" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.269437 5008 scope.go:117] "RemoveContainer" containerID="4efce2fc93ac7a338b3af0f031e3448a62c3f819288460beeb882dd6abd7cbe9" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.284548 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj4gh\" (UniqueName: \"kubernetes.io/projected/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-kube-api-access-zj4gh\") pod \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.284883 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-public-tls-certs\") pod \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.284995 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-config-data\") pod \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.286295 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-config-data-custom\") pod \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.286868 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-combined-ca-bundle\") pod \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\" (UID: 
\"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.287187 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-logs\") pod \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.287222 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-internal-tls-certs\") pod \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\" (UID: \"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.289315 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-kube-api-access-zj4gh" (OuterVolumeSpecName: "kube-api-access-zj4gh") pod "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" (UID: "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad"). InnerVolumeSpecName "kube-api-access-zj4gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.290149 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-logs" (OuterVolumeSpecName: "logs") pod "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" (UID: "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.301101 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" (UID: "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.302090 5008 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.302121 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d27fb392-40df-45a9-aeae-20781d90f02b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.302135 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.302149 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj4gh\" (UniqueName: \"kubernetes.io/projected/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-kube-api-access-zj4gh\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: E0318 18:27:26.344605 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a68259ee70a42cd608552137a47d03512f5b8b70baec67d791e451ee9cf46bac is running failed: container process not found" containerID="a68259ee70a42cd608552137a47d03512f5b8b70baec67d791e451ee9cf46bac" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 18 18:27:26 crc kubenswrapper[5008]: E0318 18:27:26.347753 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a68259ee70a42cd608552137a47d03512f5b8b70baec67d791e451ee9cf46bac is running failed: container process not found" 
containerID="a68259ee70a42cd608552137a47d03512f5b8b70baec67d791e451ee9cf46bac" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.349600 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" (UID: "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: E0318 18:27:26.361915 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a68259ee70a42cd608552137a47d03512f5b8b70baec67d791e451ee9cf46bac is running failed: container process not found" containerID="a68259ee70a42cd608552137a47d03512f5b8b70baec67d791e451ee9cf46bac" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 18 18:27:26 crc kubenswrapper[5008]: E0318 18:27:26.361986 5008 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a68259ee70a42cd608552137a47d03512f5b8b70baec67d791e451ee9cf46bac is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="7f1c2fc8-83c6-4183-ac62-f23ad5db8610" containerName="ovn-northd" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.373138 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-config-data" (OuterVolumeSpecName: "config-data") pod "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" (UID: "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.375460 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" (UID: "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.392156 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" (UID: "8b8cadfb-82b5-4427-966d-c3e5bf2a85ad"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.404237 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.404270 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.404279 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.404287 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.503524 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7f1c2fc8-83c6-4183-ac62-f23ad5db8610/ovn-northd/0.log" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.503599 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.508266 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.508310 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.508334 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.508350 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.508371 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0a95-account-create-update-wpsgf"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.508385 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0a95-account-create-update-wpsgf"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.508402 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-j5dtq"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.508415 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-j5dtq"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.514779 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.570369 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.571830 5008 scope.go:117] "RemoveContainer" containerID="2481964e9586771d0845e5f21af329f1722fa2234ce3442a00e54b97033e13d1" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.575656 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.595450 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.605090 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.606544 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsbt6\" (UniqueName: \"kubernetes.io/projected/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-kube-api-access-hsbt6\") pod \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.606651 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-ovn-northd-tls-certs\") pod \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.606931 5008 scope.go:117] "RemoveContainer" containerID="2662973c09a191ca3f45d2dc2e683ec4746e264a085d56eb97c4eff582206931" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.608037 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-galera-tls-certs\") pod \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.608870 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "8724770c-4223-4cfe-b35b-be7cd1a6a9ff" (UID: "8724770c-4223-4cfe-b35b-be7cd1a6a9ff"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.609178 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-config-data-generated\") pod \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.609284 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-metrics-certs-tls-certs\") pod \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.609614 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-combined-ca-bundle\") pod \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.610489 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.610526 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nc8m\" (UniqueName: \"kubernetes.io/projected/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-kube-api-access-4nc8m\") pod \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.610712 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-combined-ca-bundle\") pod \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.610775 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-operator-scripts\") pod \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.610803 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-scripts\") pod \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.610834 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-config-data-default\") pod \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.610871 5008 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-kolla-config\") pod \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\" (UID: \"8724770c-4223-4cfe-b35b-be7cd1a6a9ff\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.610930 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-config\") pod \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.610987 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-ovn-rundir\") pod \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\" (UID: \"7f1c2fc8-83c6-4183-ac62-f23ad5db8610\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.611250 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-kube-api-access-hsbt6" (OuterVolumeSpecName: "kube-api-access-hsbt6") pod "8724770c-4223-4cfe-b35b-be7cd1a6a9ff" (UID: "8724770c-4223-4cfe-b35b-be7cd1a6a9ff"). InnerVolumeSpecName "kube-api-access-hsbt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.612076 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-config" (OuterVolumeSpecName: "config") pod "7f1c2fc8-83c6-4183-ac62-f23ad5db8610" (UID: "7f1c2fc8-83c6-4183-ac62-f23ad5db8610"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.612135 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "8724770c-4223-4cfe-b35b-be7cd1a6a9ff" (UID: "8724770c-4223-4cfe-b35b-be7cd1a6a9ff"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.612323 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "7f1c2fc8-83c6-4183-ac62-f23ad5db8610" (UID: "7f1c2fc8-83c6-4183-ac62-f23ad5db8610"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.612772 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "8724770c-4223-4cfe-b35b-be7cd1a6a9ff" (UID: "8724770c-4223-4cfe-b35b-be7cd1a6a9ff"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.612848 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8724770c-4223-4cfe-b35b-be7cd1a6a9ff" (UID: "8724770c-4223-4cfe-b35b-be7cd1a6a9ff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.613252 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db886769-350c-4f91-a8b7-77b357bc7cda-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.613278 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsbt6\" (UniqueName: \"kubernetes.io/projected/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-kube-api-access-hsbt6\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.613289 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kml5c\" (UniqueName: \"kubernetes.io/projected/db886769-350c-4f91-a8b7-77b357bc7cda-kube-api-access-kml5c\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.613301 5008 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.614453 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-scripts" (OuterVolumeSpecName: "scripts") pod "7f1c2fc8-83c6-4183-ac62-f23ad5db8610" (UID: "7f1c2fc8-83c6-4183-ac62-f23ad5db8610"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.615319 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-kube-api-access-4nc8m" (OuterVolumeSpecName: "kube-api-access-4nc8m") pod "7f1c2fc8-83c6-4183-ac62-f23ad5db8610" (UID: "7f1c2fc8-83c6-4183-ac62-f23ad5db8610"). 
InnerVolumeSpecName "kube-api-access-4nc8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.621590 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.623355 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "8724770c-4223-4cfe-b35b-be7cd1a6a9ff" (UID: "8724770c-4223-4cfe-b35b-be7cd1a6a9ff"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.638653 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.666299 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.671293 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.684732 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f1c2fc8-83c6-4183-ac62-f23ad5db8610" (UID: "7f1c2fc8-83c6-4183-ac62-f23ad5db8610"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.689202 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8724770c-4223-4cfe-b35b-be7cd1a6a9ff" (UID: "8724770c-4223-4cfe-b35b-be7cd1a6a9ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.690773 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.699005 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-ff487fff5-mqmcg" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.705694 5008 scope.go:117] "RemoveContainer" containerID="ee06201918dbd5795e039c28701fa7c8e3bc7e2199124f057fd49d0f0f12daa4" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.714137 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm7dv\" (UniqueName: \"kubernetes.io/projected/24a03e07-237e-4583-81b4-8d9aadc76ea3-kube-api-access-zm7dv\") pod \"24a03e07-237e-4583-81b4-8d9aadc76ea3\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.714185 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-config-data\") pod \"24a03e07-237e-4583-81b4-8d9aadc76ea3\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.714219 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-config-data\") pod \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.714303 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-logs\") pod \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\" (UID: 
\"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.714337 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-config-data-custom\") pod \"24a03e07-237e-4583-81b4-8d9aadc76ea3\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.714368 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-combined-ca-bundle\") pod \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.714403 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqttq\" (UniqueName: \"kubernetes.io/projected/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-kube-api-access-zqttq\") pod \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.714426 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-config-data-custom\") pod \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\" (UID: \"d67f3431-0e44-4d3c-8aa9-0f3fb176387d\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.714497 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24a03e07-237e-4583-81b4-8d9aadc76ea3-logs\") pod \"24a03e07-237e-4583-81b4-8d9aadc76ea3\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.714536 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-combined-ca-bundle\") pod \"24a03e07-237e-4583-81b4-8d9aadc76ea3\" (UID: \"24a03e07-237e-4583-81b4-8d9aadc76ea3\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.720335 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-logs" (OuterVolumeSpecName: "logs") pod "d67f3431-0e44-4d3c-8aa9-0f3fb176387d" (UID: "d67f3431-0e44-4d3c-8aa9-0f3fb176387d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.720728 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a03e07-237e-4583-81b4-8d9aadc76ea3-logs" (OuterVolumeSpecName: "logs") pod "24a03e07-237e-4583-81b4-8d9aadc76ea3" (UID: "24a03e07-237e-4583-81b4-8d9aadc76ea3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.721876 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-kube-api-access-zqttq" (OuterVolumeSpecName: "kube-api-access-zqttq") pod "d67f3431-0e44-4d3c-8aa9-0f3fb176387d" (UID: "d67f3431-0e44-4d3c-8aa9-0f3fb176387d"). InnerVolumeSpecName "kube-api-access-zqttq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.722798 5008 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.722822 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.722835 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqttq\" (UniqueName: \"kubernetes.io/projected/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-kube-api-access-zqttq\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.727164 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a03e07-237e-4583-81b4-8d9aadc76ea3-kube-api-access-zm7dv" (OuterVolumeSpecName: "kube-api-access-zm7dv") pod "24a03e07-237e-4583-81b4-8d9aadc76ea3" (UID: "24a03e07-237e-4583-81b4-8d9aadc76ea3"). InnerVolumeSpecName "kube-api-access-zm7dv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.730682 5008 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24a03e07-237e-4583-81b4-8d9aadc76ea3-logs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.730713 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.730750 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.730764 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nc8m\" (UniqueName: \"kubernetes.io/projected/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-kube-api-access-4nc8m\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.730779 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.730795 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.730808 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.730820 5008 
reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.730831 5008 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.730842 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-config\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.731683 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "8724770c-4223-4cfe-b35b-be7cd1a6a9ff" (UID: "8724770c-4223-4cfe-b35b-be7cd1a6a9ff"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.731711 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7f1c2fc8-83c6-4183-ac62-f23ad5db8610" (UID: "7f1c2fc8-83c6-4183-ac62-f23ad5db8610"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.731720 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d67f3431-0e44-4d3c-8aa9-0f3fb176387d" (UID: "d67f3431-0e44-4d3c-8aa9-0f3fb176387d"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.733670 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "24a03e07-237e-4583-81b4-8d9aadc76ea3" (UID: "24a03e07-237e-4583-81b4-8d9aadc76ea3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.747775 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "7f1c2fc8-83c6-4183-ac62-f23ad5db8610" (UID: "7f1c2fc8-83c6-4183-ac62-f23ad5db8610"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.757766 5008 scope.go:117] "RemoveContainer" containerID="2662973c09a191ca3f45d2dc2e683ec4746e264a085d56eb97c4eff582206931" Mar 18 18:27:26 crc kubenswrapper[5008]: E0318 18:27:26.758177 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2662973c09a191ca3f45d2dc2e683ec4746e264a085d56eb97c4eff582206931\": container with ID starting with 2662973c09a191ca3f45d2dc2e683ec4746e264a085d56eb97c4eff582206931 not found: ID does not exist" containerID="2662973c09a191ca3f45d2dc2e683ec4746e264a085d56eb97c4eff582206931" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.758216 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2662973c09a191ca3f45d2dc2e683ec4746e264a085d56eb97c4eff582206931"} err="failed to get container status \"2662973c09a191ca3f45d2dc2e683ec4746e264a085d56eb97c4eff582206931\": rpc error: code = 
NotFound desc = could not find container \"2662973c09a191ca3f45d2dc2e683ec4746e264a085d56eb97c4eff582206931\": container with ID starting with 2662973c09a191ca3f45d2dc2e683ec4746e264a085d56eb97c4eff582206931 not found: ID does not exist" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.758244 5008 scope.go:117] "RemoveContainer" containerID="ee06201918dbd5795e039c28701fa7c8e3bc7e2199124f057fd49d0f0f12daa4" Mar 18 18:27:26 crc kubenswrapper[5008]: E0318 18:27:26.758603 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee06201918dbd5795e039c28701fa7c8e3bc7e2199124f057fd49d0f0f12daa4\": container with ID starting with ee06201918dbd5795e039c28701fa7c8e3bc7e2199124f057fd49d0f0f12daa4 not found: ID does not exist" containerID="ee06201918dbd5795e039c28701fa7c8e3bc7e2199124f057fd49d0f0f12daa4" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.758652 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee06201918dbd5795e039c28701fa7c8e3bc7e2199124f057fd49d0f0f12daa4"} err="failed to get container status \"ee06201918dbd5795e039c28701fa7c8e3bc7e2199124f057fd49d0f0f12daa4\": rpc error: code = NotFound desc = could not find container \"ee06201918dbd5795e039c28701fa7c8e3bc7e2199124f057fd49d0f0f12daa4\": container with ID starting with ee06201918dbd5795e039c28701fa7c8e3bc7e2199124f057fd49d0f0f12daa4 not found: ID does not exist" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.758683 5008 scope.go:117] "RemoveContainer" containerID="1bbd2b9a3501779f9dfc17cd725e0dca6af96fcee035c33d265e2278978c5d37" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.765762 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d67f3431-0e44-4d3c-8aa9-0f3fb176387d" (UID: 
"d67f3431-0e44-4d3c-8aa9-0f3fb176387d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.768127 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.768473 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.769070 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24a03e07-237e-4583-81b4-8d9aadc76ea3" (UID: "24a03e07-237e-4583-81b4-8d9aadc76ea3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.778546 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-config-data" (OuterVolumeSpecName: "config-data") pod "24a03e07-237e-4583-81b4-8d9aadc76ea3" (UID: "24a03e07-237e-4583-81b4-8d9aadc76ea3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.787758 5008 scope.go:117] "RemoveContainer" containerID="48672c4f4a417aeb7d70b46843dbaf3f5264f47434917232456154f0d644258b" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.791622 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-config-data" (OuterVolumeSpecName: "config-data") pod "d67f3431-0e44-4d3c-8aa9-0f3fb176387d" (UID: "d67f3431-0e44-4d3c-8aa9-0f3fb176387d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.807889 5008 scope.go:117] "RemoveContainer" containerID="890384dd81bae30a3f9518e1b6415cdef1c842f2f56865d0c4e3551f7664eff2" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.829017 5008 scope.go:117] "RemoveContainer" containerID="f41069b2bd17363651059e9da6c0c9d0f44b1b9587760c94ac473bda5a5dc953" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832081 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed55404d-2d05-4776-abed-7579ae87933d-combined-ca-bundle\") pod \"ed55404d-2d05-4776-abed-7579ae87933d\" (UID: \"ed55404d-2d05-4776-abed-7579ae87933d\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832333 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlwq9\" (UniqueName: \"kubernetes.io/projected/ed55404d-2d05-4776-abed-7579ae87933d-kube-api-access-xlwq9\") pod \"ed55404d-2d05-4776-abed-7579ae87933d\" (UID: \"ed55404d-2d05-4776-abed-7579ae87933d\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832357 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed55404d-2d05-4776-abed-7579ae87933d-config-data\") pod \"ed55404d-2d05-4776-abed-7579ae87933d\" (UID: \"ed55404d-2d05-4776-abed-7579ae87933d\") " Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832631 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832646 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm7dv\" (UniqueName: \"kubernetes.io/projected/24a03e07-237e-4583-81b4-8d9aadc76ea3-kube-api-access-zm7dv\") on node \"crc\" 
DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832657 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832667 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832675 5008 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832683 5008 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832690 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832700 5008 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d67f3431-0e44-4d3c-8aa9-0f3fb176387d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832708 5008 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8724770c-4223-4cfe-b35b-be7cd1a6a9ff-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832716 5008 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a03e07-237e-4583-81b4-8d9aadc76ea3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.832724 5008 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1c2fc8-83c6-4183-ac62-f23ad5db8610-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.836062 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed55404d-2d05-4776-abed-7579ae87933d-kube-api-access-xlwq9" (OuterVolumeSpecName: "kube-api-access-xlwq9") pod "ed55404d-2d05-4776-abed-7579ae87933d" (UID: "ed55404d-2d05-4776-abed-7579ae87933d"). InnerVolumeSpecName "kube-api-access-xlwq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.848317 5008 scope.go:117] "RemoveContainer" containerID="890384dd81bae30a3f9518e1b6415cdef1c842f2f56865d0c4e3551f7664eff2" Mar 18 18:27:26 crc kubenswrapper[5008]: E0318 18:27:26.848736 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"890384dd81bae30a3f9518e1b6415cdef1c842f2f56865d0c4e3551f7664eff2\": container with ID starting with 890384dd81bae30a3f9518e1b6415cdef1c842f2f56865d0c4e3551f7664eff2 not found: ID does not exist" containerID="890384dd81bae30a3f9518e1b6415cdef1c842f2f56865d0c4e3551f7664eff2" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.848773 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"890384dd81bae30a3f9518e1b6415cdef1c842f2f56865d0c4e3551f7664eff2"} err="failed to get container status \"890384dd81bae30a3f9518e1b6415cdef1c842f2f56865d0c4e3551f7664eff2\": rpc error: code = NotFound desc = could not find container 
\"890384dd81bae30a3f9518e1b6415cdef1c842f2f56865d0c4e3551f7664eff2\": container with ID starting with 890384dd81bae30a3f9518e1b6415cdef1c842f2f56865d0c4e3551f7664eff2 not found: ID does not exist" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.848809 5008 scope.go:117] "RemoveContainer" containerID="f41069b2bd17363651059e9da6c0c9d0f44b1b9587760c94ac473bda5a5dc953" Mar 18 18:27:26 crc kubenswrapper[5008]: E0318 18:27:26.849024 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f41069b2bd17363651059e9da6c0c9d0f44b1b9587760c94ac473bda5a5dc953\": container with ID starting with f41069b2bd17363651059e9da6c0c9d0f44b1b9587760c94ac473bda5a5dc953 not found: ID does not exist" containerID="f41069b2bd17363651059e9da6c0c9d0f44b1b9587760c94ac473bda5a5dc953" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.849041 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41069b2bd17363651059e9da6c0c9d0f44b1b9587760c94ac473bda5a5dc953"} err="failed to get container status \"f41069b2bd17363651059e9da6c0c9d0f44b1b9587760c94ac473bda5a5dc953\": rpc error: code = NotFound desc = could not find container \"f41069b2bd17363651059e9da6c0c9d0f44b1b9587760c94ac473bda5a5dc953\": container with ID starting with f41069b2bd17363651059e9da6c0c9d0f44b1b9587760c94ac473bda5a5dc953 not found: ID does not exist" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.849054 5008 scope.go:117] "RemoveContainer" containerID="4464660f5642808fbb37e8da2bb43669e2159a066dcb94276d716910e10ff0cd" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.856329 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed55404d-2d05-4776-abed-7579ae87933d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed55404d-2d05-4776-abed-7579ae87933d" (UID: "ed55404d-2d05-4776-abed-7579ae87933d"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.859519 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed55404d-2d05-4776-abed-7579ae87933d-config-data" (OuterVolumeSpecName: "config-data") pod "ed55404d-2d05-4776-abed-7579ae87933d" (UID: "ed55404d-2d05-4776-abed-7579ae87933d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.866205 5008 scope.go:117] "RemoveContainer" containerID="b41987f381e2bd0e36d9ddaace1b57cfba485aef900b80078b6110928f0b7ca2" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.891321 5008 scope.go:117] "RemoveContainer" containerID="4464660f5642808fbb37e8da2bb43669e2159a066dcb94276d716910e10ff0cd" Mar 18 18:27:26 crc kubenswrapper[5008]: E0318 18:27:26.891747 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4464660f5642808fbb37e8da2bb43669e2159a066dcb94276d716910e10ff0cd\": container with ID starting with 4464660f5642808fbb37e8da2bb43669e2159a066dcb94276d716910e10ff0cd not found: ID does not exist" containerID="4464660f5642808fbb37e8da2bb43669e2159a066dcb94276d716910e10ff0cd" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.891790 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4464660f5642808fbb37e8da2bb43669e2159a066dcb94276d716910e10ff0cd"} err="failed to get container status \"4464660f5642808fbb37e8da2bb43669e2159a066dcb94276d716910e10ff0cd\": rpc error: code = NotFound desc = could not find container \"4464660f5642808fbb37e8da2bb43669e2159a066dcb94276d716910e10ff0cd\": container with ID starting with 4464660f5642808fbb37e8da2bb43669e2159a066dcb94276d716910e10ff0cd not found: ID does not exist" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 
18:27:26.891816 5008 scope.go:117] "RemoveContainer" containerID="b41987f381e2bd0e36d9ddaace1b57cfba485aef900b80078b6110928f0b7ca2" Mar 18 18:27:26 crc kubenswrapper[5008]: E0318 18:27:26.892447 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b41987f381e2bd0e36d9ddaace1b57cfba485aef900b80078b6110928f0b7ca2\": container with ID starting with b41987f381e2bd0e36d9ddaace1b57cfba485aef900b80078b6110928f0b7ca2 not found: ID does not exist" containerID="b41987f381e2bd0e36d9ddaace1b57cfba485aef900b80078b6110928f0b7ca2" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.892476 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b41987f381e2bd0e36d9ddaace1b57cfba485aef900b80078b6110928f0b7ca2"} err="failed to get container status \"b41987f381e2bd0e36d9ddaace1b57cfba485aef900b80078b6110928f0b7ca2\": rpc error: code = NotFound desc = could not find container \"b41987f381e2bd0e36d9ddaace1b57cfba485aef900b80078b6110928f0b7ca2\": container with ID starting with b41987f381e2bd0e36d9ddaace1b57cfba485aef900b80078b6110928f0b7ca2 not found: ID does not exist" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.917735 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bd85b459c-gfwz9" podUID="68b393c9-78fb-4bde-930d-6af4b840f9e3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.202:5353: i/o timeout" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.933982 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlwq9\" (UniqueName: \"kubernetes.io/projected/ed55404d-2d05-4776-abed-7579ae87933d-kube-api-access-xlwq9\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.934010 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ed55404d-2d05-4776-abed-7579ae87933d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:26 crc kubenswrapper[5008]: I0318 18:27:26.934022 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed55404d-2d05-4776-abed-7579ae87933d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.201954 5008 generic.go:334] "Generic (PLEG): container finished" podID="24a03e07-237e-4583-81b4-8d9aadc76ea3" containerID="dbb3bbf45e4cb157c1f61921058ede9501e02fa9f0a5deea7ea5db2b1a9c31e5" exitCode=0 Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.202038 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" event={"ID":"24a03e07-237e-4583-81b4-8d9aadc76ea3","Type":"ContainerDied","Data":"dbb3bbf45e4cb157c1f61921058ede9501e02fa9f0a5deea7ea5db2b1a9c31e5"} Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.202076 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" event={"ID":"24a03e07-237e-4583-81b4-8d9aadc76ea3","Type":"ContainerDied","Data":"1a6c6e2676c9a628e5d3bbfb8b2283bba22da47044a87021aebcbf9ea7d8b8f2"} Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.202175 5008 scope.go:117] "RemoveContainer" containerID="dbb3bbf45e4cb157c1f61921058ede9501e02fa9f0a5deea7ea5db2b1a9c31e5" Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.202328 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-66b589877b-qzcdx" Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.218760 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.208:3000/\": dial tcp 10.217.0.208:3000: connect: connection refused" Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.226057 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8724770c-4223-4cfe-b35b-be7cd1a6a9ff","Type":"ContainerDied","Data":"349d7e3ce0980f2e4b1f9aea6abbe60c002a19772a32990d7ada7a0288f21bbe"} Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.226154 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.243784 5008 generic.go:334] "Generic (PLEG): container finished" podID="ed55404d-2d05-4776-abed-7579ae87933d" containerID="09257f442253d20a46172ce50016f2a69f725c7bb94a9a735847c735bae85e65" exitCode=0 Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.243895 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.243882 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ed55404d-2d05-4776-abed-7579ae87933d","Type":"ContainerDied","Data":"09257f442253d20a46172ce50016f2a69f725c7bb94a9a735847c735bae85e65"} Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.244280 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ed55404d-2d05-4776-abed-7579ae87933d","Type":"ContainerDied","Data":"82739a31c47e5120994fbbbb5e23bab6db863c7c56fbd5559b125868e9f21935"} Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.247312 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7f1c2fc8-83c6-4183-ac62-f23ad5db8610/ovn-northd/0.log" Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.247383 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7f1c2fc8-83c6-4183-ac62-f23ad5db8610","Type":"ContainerDied","Data":"206cf3d2c63c905b33d66457e0c59d6b9a03aa56d755f65c7bf6f090188ab2df"} Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.247445 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.252096 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77598b888d-8wwqt" event={"ID":"8b8cadfb-82b5-4427-966d-c3e5bf2a85ad","Type":"ContainerDied","Data":"65429ac5f7e421636d76dc0b135212d8027d2f152d47af343cb9b073d37b1fa1"} Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.252176 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-77598b888d-8wwqt"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.260769 5008 scope.go:117] "RemoveContainer" containerID="b33ab61dcd597370160ead5da0be76a79fadc5911c77ebcd182c96bc9147d686"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.271498 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-66b589877b-qzcdx"]
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.273431 5008 generic.go:334] "Generic (PLEG): container finished" podID="d67f3431-0e44-4d3c-8aa9-0f3fb176387d" containerID="7f6d7b25bd945a15d5845ec99b4e2fba05147a61e2fd4a6f4ddb1df33c32d951" exitCode=0
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.273540 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-ff487fff5-mqmcg"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.273593 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-ff487fff5-mqmcg" event={"ID":"d67f3431-0e44-4d3c-8aa9-0f3fb176387d","Type":"ContainerDied","Data":"7f6d7b25bd945a15d5845ec99b4e2fba05147a61e2fd4a6f4ddb1df33c32d951"}
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.273641 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-ff487fff5-mqmcg" event={"ID":"d67f3431-0e44-4d3c-8aa9-0f3fb176387d","Type":"ContainerDied","Data":"46bc49800b92d1e06efc1b3f7c4a463b3c5edbbf3daea285cb0ecc49e6cb5b1e"}
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.282732 5008 scope.go:117] "RemoveContainer" containerID="dbb3bbf45e4cb157c1f61921058ede9501e02fa9f0a5deea7ea5db2b1a9c31e5"
Mar 18 18:27:27 crc kubenswrapper[5008]: E0318 18:27:27.283116 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbb3bbf45e4cb157c1f61921058ede9501e02fa9f0a5deea7ea5db2b1a9c31e5\": container with ID starting with dbb3bbf45e4cb157c1f61921058ede9501e02fa9f0a5deea7ea5db2b1a9c31e5 not found: ID does not exist" containerID="dbb3bbf45e4cb157c1f61921058ede9501e02fa9f0a5deea7ea5db2b1a9c31e5"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.283140 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb3bbf45e4cb157c1f61921058ede9501e02fa9f0a5deea7ea5db2b1a9c31e5"} err="failed to get container status \"dbb3bbf45e4cb157c1f61921058ede9501e02fa9f0a5deea7ea5db2b1a9c31e5\": rpc error: code = NotFound desc = could not find container \"dbb3bbf45e4cb157c1f61921058ede9501e02fa9f0a5deea7ea5db2b1a9c31e5\": container with ID starting with dbb3bbf45e4cb157c1f61921058ede9501e02fa9f0a5deea7ea5db2b1a9c31e5 not found: ID does not exist"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.283157 5008 scope.go:117] "RemoveContainer" containerID="b33ab61dcd597370160ead5da0be76a79fadc5911c77ebcd182c96bc9147d686"
Mar 18 18:27:27 crc kubenswrapper[5008]: E0318 18:27:27.283409 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b33ab61dcd597370160ead5da0be76a79fadc5911c77ebcd182c96bc9147d686\": container with ID starting with b33ab61dcd597370160ead5da0be76a79fadc5911c77ebcd182c96bc9147d686 not found: ID does not exist" containerID="b33ab61dcd597370160ead5da0be76a79fadc5911c77ebcd182c96bc9147d686"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.283442 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33ab61dcd597370160ead5da0be76a79fadc5911c77ebcd182c96bc9147d686"} err="failed to get container status \"b33ab61dcd597370160ead5da0be76a79fadc5911c77ebcd182c96bc9147d686\": rpc error: code = NotFound desc = could not find container \"b33ab61dcd597370160ead5da0be76a79fadc5911c77ebcd182c96bc9147d686\": container with ID starting with b33ab61dcd597370160ead5da0be76a79fadc5911c77ebcd182c96bc9147d686 not found: ID does not exist"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.283465 5008 scope.go:117] "RemoveContainer" containerID="2735f632f3585bddce99d7d606ae426bc322f9f8793ae4e5d5d4ce755bf8652e"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.293148 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-66b589877b-qzcdx"]
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.313102 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.317041 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"]
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.326052 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-77598b888d-8wwqt"]
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.335697 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-77598b888d-8wwqt"]
Mar 18 18:27:27 crc kubenswrapper[5008]: E0318 18:27:27.343050 5008 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Mar 18 18:27:27 crc kubenswrapper[5008]: E0318 18:27:27.343153 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data podName:b60d757b-db66-46c1-ad92-4a9e591217a0 nodeName:}" failed. No retries permitted until 2026-03-18 18:27:35.343106926 +0000 UTC m=+1511.862580005 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data") pod "rabbitmq-cell1-server-0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0") : configmap "rabbitmq-cell1-config-data" not found
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.471103 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.479730 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"]
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.485263 5008 scope.go:117] "RemoveContainer" containerID="4c8e9dc94e541b20c0bd59bf7201a375c6c5baca750bed30c0ce7aadbe5de66b"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.486532 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.492993 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.498308 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-ff487fff5-mqmcg"]
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.503434 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-ff487fff5-mqmcg"]
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.526454 5008 scope.go:117] "RemoveContainer" containerID="09257f442253d20a46172ce50016f2a69f725c7bb94a9a735847c735bae85e65"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.555093 5008 scope.go:117] "RemoveContainer" containerID="09257f442253d20a46172ce50016f2a69f725c7bb94a9a735847c735bae85e65"
Mar 18 18:27:27 crc kubenswrapper[5008]: E0318 18:27:27.557240 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09257f442253d20a46172ce50016f2a69f725c7bb94a9a735847c735bae85e65\": container with ID starting with 09257f442253d20a46172ce50016f2a69f725c7bb94a9a735847c735bae85e65 not found: ID does not exist" containerID="09257f442253d20a46172ce50016f2a69f725c7bb94a9a735847c735bae85e65"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.557301 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09257f442253d20a46172ce50016f2a69f725c7bb94a9a735847c735bae85e65"} err="failed to get container status \"09257f442253d20a46172ce50016f2a69f725c7bb94a9a735847c735bae85e65\": rpc error: code = NotFound desc = could not find container \"09257f442253d20a46172ce50016f2a69f725c7bb94a9a735847c735bae85e65\": container with ID starting with 09257f442253d20a46172ce50016f2a69f725c7bb94a9a735847c735bae85e65 not found: ID does not exist"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.557329 5008 scope.go:117] "RemoveContainer" containerID="c17a1b2d5a41cb5fcdc52c5477e557d5c46788d343c6e07f1cda8f8d094698b3"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.578229 5008 scope.go:117] "RemoveContainer" containerID="a68259ee70a42cd608552137a47d03512f5b8b70baec67d791e451ee9cf46bac"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.602530 5008 scope.go:117] "RemoveContainer" containerID="4eadc2692f3a8d44b36203cb5d3923e2c3a539a3f5ba5c413d2a6377e7749375"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.625894 5008 scope.go:117] "RemoveContainer" containerID="e741fbe7654b140e43fc28460cedf85595febc32bb65a05c9b3ffb75588298c2"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.652937 5008 scope.go:117] "RemoveContainer" containerID="7f6d7b25bd945a15d5845ec99b4e2fba05147a61e2fd4a6f4ddb1df33c32d951"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.674708 5008 scope.go:117] "RemoveContainer" containerID="f7c289c9bf02046bec3de029584d233f16cf2686d6ecf191d260ee9b9b82fc59"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.693047 5008 scope.go:117] "RemoveContainer" containerID="7f6d7b25bd945a15d5845ec99b4e2fba05147a61e2fd4a6f4ddb1df33c32d951"
Mar 18 18:27:27 crc kubenswrapper[5008]: E0318 18:27:27.696920 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f6d7b25bd945a15d5845ec99b4e2fba05147a61e2fd4a6f4ddb1df33c32d951\": container with ID starting with 7f6d7b25bd945a15d5845ec99b4e2fba05147a61e2fd4a6f4ddb1df33c32d951 not found: ID does not exist" containerID="7f6d7b25bd945a15d5845ec99b4e2fba05147a61e2fd4a6f4ddb1df33c32d951"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.696986 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f6d7b25bd945a15d5845ec99b4e2fba05147a61e2fd4a6f4ddb1df33c32d951"} err="failed to get container status \"7f6d7b25bd945a15d5845ec99b4e2fba05147a61e2fd4a6f4ddb1df33c32d951\": rpc error: code = NotFound desc = could not find container \"7f6d7b25bd945a15d5845ec99b4e2fba05147a61e2fd4a6f4ddb1df33c32d951\": container with ID starting with 7f6d7b25bd945a15d5845ec99b4e2fba05147a61e2fd4a6f4ddb1df33c32d951 not found: ID does not exist"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.697031 5008 scope.go:117] "RemoveContainer" containerID="f7c289c9bf02046bec3de029584d233f16cf2686d6ecf191d260ee9b9b82fc59"
Mar 18 18:27:27 crc kubenswrapper[5008]: E0318 18:27:27.697657 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7c289c9bf02046bec3de029584d233f16cf2686d6ecf191d260ee9b9b82fc59\": container with ID starting with f7c289c9bf02046bec3de029584d233f16cf2686d6ecf191d260ee9b9b82fc59 not found: ID does not exist" containerID="f7c289c9bf02046bec3de029584d233f16cf2686d6ecf191d260ee9b9b82fc59"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.697711 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c289c9bf02046bec3de029584d233f16cf2686d6ecf191d260ee9b9b82fc59"} err="failed to get container status \"f7c289c9bf02046bec3de029584d233f16cf2686d6ecf191d260ee9b9b82fc59\": rpc error: code = NotFound desc = could not find container \"f7c289c9bf02046bec3de029584d233f16cf2686d6ecf191d260ee9b9b82fc59\": container with ID starting with f7c289c9bf02046bec3de029584d233f16cf2686d6ecf191d260ee9b9b82fc59 not found: ID does not exist"
Mar 18 18:27:27 crc kubenswrapper[5008]: E0318 18:27:27.760147 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 18:27:27 crc kubenswrapper[5008]: E0318 18:27:27.762164 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 18:27:27 crc kubenswrapper[5008]: E0318 18:27:27.763347 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 18 18:27:27 crc kubenswrapper[5008]: E0318 18:27:27.763376 5008 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="dd462bb4-44f5-4e0f-bc17-53d24604d474" containerName="nova-cell1-conductor-conductor"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.797047 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d8b8459c4-2mq5n"
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.853239 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brd8q\" (UniqueName: \"kubernetes.io/projected/16314cf6-663f-4fa9-a1e7-272c1a183b58-kube-api-access-brd8q\") pod \"16314cf6-663f-4fa9-a1e7-272c1a183b58\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") "
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.853425 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-internal-tls-certs\") pod \"16314cf6-663f-4fa9-a1e7-272c1a183b58\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") "
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.853463 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-public-tls-certs\") pod \"16314cf6-663f-4fa9-a1e7-272c1a183b58\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") "
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.853582 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-config-data\") pod \"16314cf6-663f-4fa9-a1e7-272c1a183b58\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") "
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.853625 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-scripts\") pod \"16314cf6-663f-4fa9-a1e7-272c1a183b58\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") "
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.853665 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-credential-keys\") pod \"16314cf6-663f-4fa9-a1e7-272c1a183b58\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") "
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.853688 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-fernet-keys\") pod \"16314cf6-663f-4fa9-a1e7-272c1a183b58\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") "
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.853719 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-combined-ca-bundle\") pod \"16314cf6-663f-4fa9-a1e7-272c1a183b58\" (UID: \"16314cf6-663f-4fa9-a1e7-272c1a183b58\") "
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.861683 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "16314cf6-663f-4fa9-a1e7-272c1a183b58" (UID: "16314cf6-663f-4fa9-a1e7-272c1a183b58"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.861789 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "16314cf6-663f-4fa9-a1e7-272c1a183b58" (UID: "16314cf6-663f-4fa9-a1e7-272c1a183b58"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.861815 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-scripts" (OuterVolumeSpecName: "scripts") pod "16314cf6-663f-4fa9-a1e7-272c1a183b58" (UID: "16314cf6-663f-4fa9-a1e7-272c1a183b58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.861873 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16314cf6-663f-4fa9-a1e7-272c1a183b58-kube-api-access-brd8q" (OuterVolumeSpecName: "kube-api-access-brd8q") pod "16314cf6-663f-4fa9-a1e7-272c1a183b58" (UID: "16314cf6-663f-4fa9-a1e7-272c1a183b58"). InnerVolumeSpecName "kube-api-access-brd8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.885358 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-config-data" (OuterVolumeSpecName: "config-data") pod "16314cf6-663f-4fa9-a1e7-272c1a183b58" (UID: "16314cf6-663f-4fa9-a1e7-272c1a183b58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.891911 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16314cf6-663f-4fa9-a1e7-272c1a183b58" (UID: "16314cf6-663f-4fa9-a1e7-272c1a183b58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.907805 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "16314cf6-663f-4fa9-a1e7-272c1a183b58" (UID: "16314cf6-663f-4fa9-a1e7-272c1a183b58"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.919339 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "16314cf6-663f-4fa9-a1e7-272c1a183b58" (UID: "16314cf6-663f-4fa9-a1e7-272c1a183b58"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.955230 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.955269 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.955282 5008 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.955296 5008 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.955307 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.955320 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brd8q\" (UniqueName: \"kubernetes.io/projected/16314cf6-663f-4fa9-a1e7-272c1a183b58-kube-api-access-brd8q\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.955331 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:27 crc kubenswrapper[5008]: I0318 18:27:27.955339 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16314cf6-663f-4fa9-a1e7-272c1a183b58-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.241747 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a03e07-237e-4583-81b4-8d9aadc76ea3" path="/var/lib/kubelet/pods/24a03e07-237e-4583-81b4-8d9aadc76ea3/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.243869 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="582dafe2-2020-4966-921d-cc5e9f0db46c" path="/var/lib/kubelet/pods/582dafe2-2020-4966-921d-cc5e9f0db46c/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.245375 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f1c2fc8-83c6-4183-ac62-f23ad5db8610" path="/var/lib/kubelet/pods/7f1c2fc8-83c6-4183-ac62-f23ad5db8610/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.247719 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8679cebf-8eea-45ae-be70-26eea9396f8e" path="/var/lib/kubelet/pods/8679cebf-8eea-45ae-be70-26eea9396f8e/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.249131 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8724770c-4223-4cfe-b35b-be7cd1a6a9ff" path="/var/lib/kubelet/pods/8724770c-4223-4cfe-b35b-be7cd1a6a9ff/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.252695 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8724ccad-851e-4efc-ad3c-d34252a3f29f" path="/var/lib/kubelet/pods/8724ccad-851e-4efc-ad3c-d34252a3f29f/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.253593 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" path="/var/lib/kubelet/pods/8b8cadfb-82b5-4427-966d-c3e5bf2a85ad/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.254769 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96efea0e-17ae-49c4-8f5c-b7341def6878" path="/var/lib/kubelet/pods/96efea0e-17ae-49c4-8f5c-b7341def6878/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.256887 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bda3600a-d612-43ec-8b45-77eccc420b0f" path="/var/lib/kubelet/pods/bda3600a-d612-43ec-8b45-77eccc420b0f/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.257890 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d27fb392-40df-45a9-aeae-20781d90f02b" path="/var/lib/kubelet/pods/d27fb392-40df-45a9-aeae-20781d90f02b/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.260143 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d67f3431-0e44-4d3c-8aa9-0f3fb176387d" path="/var/lib/kubelet/pods/d67f3431-0e44-4d3c-8aa9-0f3fb176387d/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.261701 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db886769-350c-4f91-a8b7-77b357bc7cda" path="/var/lib/kubelet/pods/db886769-350c-4f91-a8b7-77b357bc7cda/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.262674 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed55404d-2d05-4776-abed-7579ae87933d" path="/var/lib/kubelet/pods/ed55404d-2d05-4776-abed-7579ae87933d/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.264760 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f26207e6-102f-4160-be7d-e1cad865fcc6" path="/var/lib/kubelet/pods/f26207e6-102f-4160-be7d-e1cad865fcc6/volumes"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.290452 5008 generic.go:334] "Generic (PLEG): container finished" podID="16314cf6-663f-4fa9-a1e7-272c1a183b58" containerID="5398febbc7a97bbef77797f71cbc94ea09c16ee97775bfa2cf52a7892be384b7" exitCode=0
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.290508 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d8b8459c4-2mq5n" event={"ID":"16314cf6-663f-4fa9-a1e7-272c1a183b58","Type":"ContainerDied","Data":"5398febbc7a97bbef77797f71cbc94ea09c16ee97775bfa2cf52a7892be384b7"}
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.290584 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d8b8459c4-2mq5n" event={"ID":"16314cf6-663f-4fa9-a1e7-272c1a183b58","Type":"ContainerDied","Data":"8186640c7e91af798f46b247e00dbc88fe015ca6aad2b1642f9b8cdf8f15de9f"}
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.290606 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d8b8459c4-2mq5n"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.290615 5008 scope.go:117] "RemoveContainer" containerID="5398febbc7a97bbef77797f71cbc94ea09c16ee97775bfa2cf52a7892be384b7"
Mar 18 18:27:28 crc kubenswrapper[5008]: E0318 18:27:28.321179 5008 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Mar 18 18:27:28 crc kubenswrapper[5008]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-18T18:27:21Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock)
Mar 18 18:27:28 crc kubenswrapper[5008]: /etc/init.d/functions: line 589: 407 Alarm clock "$@"
Mar 18 18:27:28 crc kubenswrapper[5008]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-9qcqj" message=<
Mar 18 18:27:28 crc kubenswrapper[5008]: Exiting ovn-controller (1) [FAILED]
Mar 18 18:27:28 crc kubenswrapper[5008]: Killing ovn-controller (1) [ OK ]
Mar 18 18:27:28 crc kubenswrapper[5008]: 2026-03-18T18:27:21Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock)
Mar 18 18:27:28 crc kubenswrapper[5008]: /etc/init.d/functions: line 589: 407 Alarm clock "$@"
Mar 18 18:27:28 crc kubenswrapper[5008]: >
Mar 18 18:27:28 crc kubenswrapper[5008]: E0318 18:27:28.321221 5008 kuberuntime_container.go:691] "PreStop hook failed" err=<
Mar 18 18:27:28 crc kubenswrapper[5008]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-18T18:27:21Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock)
Mar 18 18:27:28 crc kubenswrapper[5008]: /etc/init.d/functions: line 589: 407 Alarm clock "$@"
Mar 18 18:27:28 crc kubenswrapper[5008]: > pod="openstack/ovn-controller-9qcqj" podUID="aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" containerName="ovn-controller" containerID="cri-o://17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.321259 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-9qcqj" podUID="aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" containerName="ovn-controller" containerID="cri-o://17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f" gracePeriod=23
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.476927 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7d8b8459c4-2mq5n"]
Mar 18 18:27:28 crc kubenswrapper[5008]: E0318 18:27:28.481498 5008 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/b6e5093da8ba4cb2547a74b9c37797a4e8a6548ff78b0570c47f9a3664872903/diff" to get inode usage: stat /var/lib/containers/storage/overlay/b6e5093da8ba4cb2547a74b9c37797a4e8a6548ff78b0570c47f9a3664872903/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_ovn-northd-0_7f1c2fc8-83c6-4183-ac62-f23ad5db8610/ovn-northd/0.log" to get inode usage: stat /var/log/pods/openstack_ovn-northd-0_7f1c2fc8-83c6-4183-ac62-f23ad5db8610/ovn-northd/0.log: no such file or directory
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.483125 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7d8b8459c4-2mq5n"]
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.486662 5008 scope.go:117] "RemoveContainer" containerID="5398febbc7a97bbef77797f71cbc94ea09c16ee97775bfa2cf52a7892be384b7"
Mar 18 18:27:28 crc kubenswrapper[5008]: E0318 18:27:28.490064 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5398febbc7a97bbef77797f71cbc94ea09c16ee97775bfa2cf52a7892be384b7\": container with ID starting with 5398febbc7a97bbef77797f71cbc94ea09c16ee97775bfa2cf52a7892be384b7 not found: ID does not exist" containerID="5398febbc7a97bbef77797f71cbc94ea09c16ee97775bfa2cf52a7892be384b7"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.490105 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5398febbc7a97bbef77797f71cbc94ea09c16ee97775bfa2cf52a7892be384b7"} err="failed to get container status \"5398febbc7a97bbef77797f71cbc94ea09c16ee97775bfa2cf52a7892be384b7\": rpc error: code = NotFound desc = could not find container \"5398febbc7a97bbef77797f71cbc94ea09c16ee97775bfa2cf52a7892be384b7\": container with ID starting with 5398febbc7a97bbef77797f71cbc94ea09c16ee97775bfa2cf52a7892be384b7 not found: ID does not exist"
Mar 18 18:27:28 crc kubenswrapper[5008]: E0318 18:27:28.565381 5008 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Mar 18 18:27:28 crc kubenswrapper[5008]: E0318 18:27:28.565473 5008 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data podName:3d5f0191-2702-46ed-ab82-e8c93ec1cf02 nodeName:}" failed. No retries permitted until 2026-03-18 18:27:36.565456137 +0000 UTC m=+1513.084929216 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data") pod "rabbitmq-server-0" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02") : configmap "rabbitmq-config-data" not found
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.724197 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.733501 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9qcqj_aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c/ovn-controller/0.log"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.733591 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9qcqj"
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.768787 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4wzt\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-kube-api-access-k4wzt\") pod \"b60d757b-db66-46c1-ad92-4a9e591217a0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.768847 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7gtp\" (UniqueName: \"kubernetes.io/projected/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-kube-api-access-q7gtp\") pod \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.768881 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-confd\") pod \"b60d757b-db66-46c1-ad92-4a9e591217a0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.768900 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-combined-ca-bundle\") pod \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.768934 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-run\") pod \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.768982 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b60d757b-db66-46c1-ad92-4a9e591217a0-pod-info\") pod \"b60d757b-db66-46c1-ad92-4a9e591217a0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.769008 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b60d757b-db66-46c1-ad92-4a9e591217a0-erlang-cookie-secret\") pod \"b60d757b-db66-46c1-ad92-4a9e591217a0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.769027 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-tls\") pod \"b60d757b-db66-46c1-ad92-4a9e591217a0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.769062 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-scripts\") pod \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.769109 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-erlang-cookie\") pod \"b60d757b-db66-46c1-ad92-4a9e591217a0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.769130 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-plugins-conf\") pod \"b60d757b-db66-46c1-ad92-4a9e591217a0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.769151 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-plugins\") pod \"b60d757b-db66-46c1-ad92-4a9e591217a0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.769177 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-ovn-controller-tls-certs\") pod \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.769195 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-server-conf\") pod \"b60d757b-db66-46c1-ad92-4a9e591217a0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.769216 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-log-ovn\") pod \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.769234 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"b60d757b-db66-46c1-ad92-4a9e591217a0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.769254 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data\") pod \"b60d757b-db66-46c1-ad92-4a9e591217a0\" (UID: \"b60d757b-db66-46c1-ad92-4a9e591217a0\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.769269 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-run-ovn\") pod \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\" (UID: \"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c\") "
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.770433 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b60d757b-db66-46c1-ad92-4a9e591217a0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.770966 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b60d757b-db66-46c1-ad92-4a9e591217a0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.771769 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b60d757b-db66-46c1-ad92-4a9e591217a0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0"). InnerVolumeSpecName "plugins-conf".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.774315 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-run" (OuterVolumeSpecName: "var-run") pod "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" (UID: "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.774445 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" (UID: "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.775217 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" (UID: "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.775502 5008 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.775521 5008 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.775532 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.775544 5008 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.775657 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.775670 5008 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.777323 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-scripts" (OuterVolumeSpecName: "scripts") pod "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" (UID: "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.777860 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-kube-api-access-k4wzt" (OuterVolumeSpecName: "kube-api-access-k4wzt") pod "b60d757b-db66-46c1-ad92-4a9e591217a0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0"). InnerVolumeSpecName "kube-api-access-k4wzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.788872 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b60d757b-db66-46c1-ad92-4a9e591217a0-pod-info" (OuterVolumeSpecName: "pod-info") pod "b60d757b-db66-46c1-ad92-4a9e591217a0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.789041 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b60d757b-db66-46c1-ad92-4a9e591217a0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.789134 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b60d757b-db66-46c1-ad92-4a9e591217a0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b60d757b-db66-46c1-ad92-4a9e591217a0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.789209 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-kube-api-access-q7gtp" (OuterVolumeSpecName: "kube-api-access-q7gtp") pod "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" (UID: "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c"). InnerVolumeSpecName "kube-api-access-q7gtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.806694 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "b60d757b-db66-46c1-ad92-4a9e591217a0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.808198 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data" (OuterVolumeSpecName: "config-data") pod "b60d757b-db66-46c1-ad92-4a9e591217a0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.834088 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" (UID: "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.848141 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-server-conf" (OuterVolumeSpecName: "server-conf") pod "b60d757b-db66-46c1-ad92-4a9e591217a0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.874068 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" (UID: "aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.877419 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7gtp\" (UniqueName: \"kubernetes.io/projected/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-kube-api-access-q7gtp\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.877455 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.877466 5008 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b60d757b-db66-46c1-ad92-4a9e591217a0-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.877478 5008 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/b60d757b-db66-46c1-ad92-4a9e591217a0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.877488 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.877497 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.877509 5008 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.877518 5008 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.877571 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.877581 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b60d757b-db66-46c1-ad92-4a9e591217a0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.877592 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4wzt\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-kube-api-access-k4wzt\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc 
kubenswrapper[5008]: I0318 18:27:28.894711 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.895629 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b60d757b-db66-46c1-ad92-4a9e591217a0" (UID: "b60d757b-db66-46c1-ad92-4a9e591217a0"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.979464 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:28 crc kubenswrapper[5008]: I0318 18:27:28.979502 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b60d757b-db66-46c1-ad92-4a9e591217a0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.208915 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.283024 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.283064 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-plugins-conf\") pod \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.283087 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-erlang-cookie\") pod \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.283114 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-erlang-cookie-secret\") pod \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.283146 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-pod-info\") pod \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.283228 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-server-conf\") pod \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.283262 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-confd\") pod \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.283297 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghgp5\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-kube-api-access-ghgp5\") pod \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.283319 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-plugins\") pod \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.283364 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-tls\") pod \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.283412 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data\") pod \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\" (UID: \"3d5f0191-2702-46ed-ab82-e8c93ec1cf02\") " Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 
18:27:29.283673 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3d5f0191-2702-46ed-ab82-e8c93ec1cf02" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.283855 5008 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.286053 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3d5f0191-2702-46ed-ab82-e8c93ec1cf02" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.286714 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3d5f0191-2702-46ed-ab82-e8c93ec1cf02" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.289305 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3d5f0191-2702-46ed-ab82-e8c93ec1cf02" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.289618 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "3d5f0191-2702-46ed-ab82-e8c93ec1cf02" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.289923 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3d5f0191-2702-46ed-ab82-e8c93ec1cf02" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.290445 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-kube-api-access-ghgp5" (OuterVolumeSpecName: "kube-api-access-ghgp5") pod "3d5f0191-2702-46ed-ab82-e8c93ec1cf02" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02"). InnerVolumeSpecName "kube-api-access-ghgp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.290696 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-pod-info" (OuterVolumeSpecName: "pod-info") pod "3d5f0191-2702-46ed-ab82-e8c93ec1cf02" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.306168 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data" (OuterVolumeSpecName: "config-data") pod "3d5f0191-2702-46ed-ab82-e8c93ec1cf02" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.320025 5008 generic.go:334] "Generic (PLEG): container finished" podID="3d5f0191-2702-46ed-ab82-e8c93ec1cf02" containerID="de7e46d1ea764d5f5ca940e23f45711a4a0780768418aba0d16fc5bc141f5795" exitCode=0 Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.320087 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3d5f0191-2702-46ed-ab82-e8c93ec1cf02","Type":"ContainerDied","Data":"de7e46d1ea764d5f5ca940e23f45711a4a0780768418aba0d16fc5bc141f5795"} Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.320112 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3d5f0191-2702-46ed-ab82-e8c93ec1cf02","Type":"ContainerDied","Data":"b69518ed40ab043502b3456686e18f20d80a4f35f6635ec351a6747e9c15a891"} Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.320128 5008 scope.go:117] "RemoveContainer" containerID="de7e46d1ea764d5f5ca940e23f45711a4a0780768418aba0d16fc5bc141f5795" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.320152 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.324688 5008 generic.go:334] "Generic (PLEG): container finished" podID="b60d757b-db66-46c1-ad92-4a9e591217a0" containerID="82ae68d17e245549f6e1d17dc9fc655108ab134769fb5a66e64e73b24d153411" exitCode=0 Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.324787 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.324823 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b60d757b-db66-46c1-ad92-4a9e591217a0","Type":"ContainerDied","Data":"82ae68d17e245549f6e1d17dc9fc655108ab134769fb5a66e64e73b24d153411"} Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.324858 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b60d757b-db66-46c1-ad92-4a9e591217a0","Type":"ContainerDied","Data":"44096718cbe5c33667ee937bbb0cce30deb092bb95ba0d99985feb6af80d0f06"} Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.327041 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9qcqj_aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c/ovn-controller/0.log" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.327083 5008 generic.go:334] "Generic (PLEG): container finished" podID="aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" containerID="17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f" exitCode=139 Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.327109 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9qcqj" event={"ID":"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c","Type":"ContainerDied","Data":"17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f"} Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.327129 5008 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovn-controller-9qcqj" event={"ID":"aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c","Type":"ContainerDied","Data":"4410d4ffa290a2192a4a02ef2467e4fab3f152dbd39c8446abba6e1b4840b3e1"} Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.327173 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9qcqj" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.361312 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-server-conf" (OuterVolumeSpecName: "server-conf") pod "3d5f0191-2702-46ed-ab82-e8c93ec1cf02" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.363115 5008 scope.go:117] "RemoveContainer" containerID="0fa5685cc03d616eba3ae80660451a0ccfc2e904f623b03160efae22735e2260" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.372815 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.385566 5008 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.385892 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghgp5\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-kube-api-access-ghgp5\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.385907 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:29 crc 
kubenswrapper[5008]: I0318 18:27:29.385922 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.385933 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.385968 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.385983 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.385995 5008 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.386005 5008 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.390023 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.404242 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 18 18:27:29 crc 
kubenswrapper[5008]: I0318 18:27:29.410336 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3d5f0191-2702-46ed-ab82-e8c93ec1cf02" (UID: "3d5f0191-2702-46ed-ab82-e8c93ec1cf02"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.461302 5008 scope.go:117] "RemoveContainer" containerID="de7e46d1ea764d5f5ca940e23f45711a4a0780768418aba0d16fc5bc141f5795" Mar 18 18:27:29 crc kubenswrapper[5008]: E0318 18:27:29.462105 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de7e46d1ea764d5f5ca940e23f45711a4a0780768418aba0d16fc5bc141f5795\": container with ID starting with de7e46d1ea764d5f5ca940e23f45711a4a0780768418aba0d16fc5bc141f5795 not found: ID does not exist" containerID="de7e46d1ea764d5f5ca940e23f45711a4a0780768418aba0d16fc5bc141f5795" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.462156 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de7e46d1ea764d5f5ca940e23f45711a4a0780768418aba0d16fc5bc141f5795"} err="failed to get container status \"de7e46d1ea764d5f5ca940e23f45711a4a0780768418aba0d16fc5bc141f5795\": rpc error: code = NotFound desc = could not find container \"de7e46d1ea764d5f5ca940e23f45711a4a0780768418aba0d16fc5bc141f5795\": container with ID starting with de7e46d1ea764d5f5ca940e23f45711a4a0780768418aba0d16fc5bc141f5795 not found: ID does not exist" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.462190 5008 scope.go:117] "RemoveContainer" containerID="0fa5685cc03d616eba3ae80660451a0ccfc2e904f623b03160efae22735e2260" Mar 18 18:27:29 crc kubenswrapper[5008]: E0318 18:27:29.462532 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"0fa5685cc03d616eba3ae80660451a0ccfc2e904f623b03160efae22735e2260\": container with ID starting with 0fa5685cc03d616eba3ae80660451a0ccfc2e904f623b03160efae22735e2260 not found: ID does not exist" containerID="0fa5685cc03d616eba3ae80660451a0ccfc2e904f623b03160efae22735e2260" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.462615 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fa5685cc03d616eba3ae80660451a0ccfc2e904f623b03160efae22735e2260"} err="failed to get container status \"0fa5685cc03d616eba3ae80660451a0ccfc2e904f623b03160efae22735e2260\": rpc error: code = NotFound desc = could not find container \"0fa5685cc03d616eba3ae80660451a0ccfc2e904f623b03160efae22735e2260\": container with ID starting with 0fa5685cc03d616eba3ae80660451a0ccfc2e904f623b03160efae22735e2260 not found: ID does not exist" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.462654 5008 scope.go:117] "RemoveContainer" containerID="82ae68d17e245549f6e1d17dc9fc655108ab134769fb5a66e64e73b24d153411" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.462800 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9qcqj"] Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.470252 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9qcqj"] Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.489987 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d5f0191-2702-46ed-ab82-e8c93ec1cf02-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.490066 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.493692 5008 scope.go:117] "RemoveContainer" 
containerID="52849f9d96333bfa73b7b07421af885ac8db5ecfd31488330c2f6db6ec84ddc2" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.512735 5008 scope.go:117] "RemoveContainer" containerID="82ae68d17e245549f6e1d17dc9fc655108ab134769fb5a66e64e73b24d153411" Mar 18 18:27:29 crc kubenswrapper[5008]: E0318 18:27:29.513371 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ae68d17e245549f6e1d17dc9fc655108ab134769fb5a66e64e73b24d153411\": container with ID starting with 82ae68d17e245549f6e1d17dc9fc655108ab134769fb5a66e64e73b24d153411 not found: ID does not exist" containerID="82ae68d17e245549f6e1d17dc9fc655108ab134769fb5a66e64e73b24d153411" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.513421 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ae68d17e245549f6e1d17dc9fc655108ab134769fb5a66e64e73b24d153411"} err="failed to get container status \"82ae68d17e245549f6e1d17dc9fc655108ab134769fb5a66e64e73b24d153411\": rpc error: code = NotFound desc = could not find container \"82ae68d17e245549f6e1d17dc9fc655108ab134769fb5a66e64e73b24d153411\": container with ID starting with 82ae68d17e245549f6e1d17dc9fc655108ab134769fb5a66e64e73b24d153411 not found: ID does not exist" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.513453 5008 scope.go:117] "RemoveContainer" containerID="52849f9d96333bfa73b7b07421af885ac8db5ecfd31488330c2f6db6ec84ddc2" Mar 18 18:27:29 crc kubenswrapper[5008]: E0318 18:27:29.513740 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52849f9d96333bfa73b7b07421af885ac8db5ecfd31488330c2f6db6ec84ddc2\": container with ID starting with 52849f9d96333bfa73b7b07421af885ac8db5ecfd31488330c2f6db6ec84ddc2 not found: ID does not exist" containerID="52849f9d96333bfa73b7b07421af885ac8db5ecfd31488330c2f6db6ec84ddc2" Mar 18 18:27:29 crc 
kubenswrapper[5008]: I0318 18:27:29.513763 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52849f9d96333bfa73b7b07421af885ac8db5ecfd31488330c2f6db6ec84ddc2"} err="failed to get container status \"52849f9d96333bfa73b7b07421af885ac8db5ecfd31488330c2f6db6ec84ddc2\": rpc error: code = NotFound desc = could not find container \"52849f9d96333bfa73b7b07421af885ac8db5ecfd31488330c2f6db6ec84ddc2\": container with ID starting with 52849f9d96333bfa73b7b07421af885ac8db5ecfd31488330c2f6db6ec84ddc2 not found: ID does not exist" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.513776 5008 scope.go:117] "RemoveContainer" containerID="17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.540032 5008 scope.go:117] "RemoveContainer" containerID="17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f" Mar 18 18:27:29 crc kubenswrapper[5008]: E0318 18:27:29.540466 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f\": container with ID starting with 17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f not found: ID does not exist" containerID="17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f" Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.540496 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f"} err="failed to get container status \"17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f\": rpc error: code = NotFound desc = could not find container \"17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f\": container with ID starting with 17474046de5efe65995c4c9f491fb8fcb6970e1ae1143d3bc78031d00863e77f not found: ID does not exist" Mar 18 
18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.763691 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.774880 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 18:27:29 crc kubenswrapper[5008]: I0318 18:27:29.943231 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.003709 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd462bb4-44f5-4e0f-bc17-53d24604d474-config-data\") pod \"dd462bb4-44f5-4e0f-bc17-53d24604d474\" (UID: \"dd462bb4-44f5-4e0f-bc17-53d24604d474\") " Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.003753 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd462bb4-44f5-4e0f-bc17-53d24604d474-combined-ca-bundle\") pod \"dd462bb4-44f5-4e0f-bc17-53d24604d474\" (UID: \"dd462bb4-44f5-4e0f-bc17-53d24604d474\") " Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.003774 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5pt7\" (UniqueName: \"kubernetes.io/projected/dd462bb4-44f5-4e0f-bc17-53d24604d474-kube-api-access-x5pt7\") pod \"dd462bb4-44f5-4e0f-bc17-53d24604d474\" (UID: \"dd462bb4-44f5-4e0f-bc17-53d24604d474\") " Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.009000 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd462bb4-44f5-4e0f-bc17-53d24604d474-kube-api-access-x5pt7" (OuterVolumeSpecName: "kube-api-access-x5pt7") pod "dd462bb4-44f5-4e0f-bc17-53d24604d474" (UID: "dd462bb4-44f5-4e0f-bc17-53d24604d474"). InnerVolumeSpecName "kube-api-access-x5pt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.029620 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd462bb4-44f5-4e0f-bc17-53d24604d474-config-data" (OuterVolumeSpecName: "config-data") pod "dd462bb4-44f5-4e0f-bc17-53d24604d474" (UID: "dd462bb4-44f5-4e0f-bc17-53d24604d474"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.031975 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd462bb4-44f5-4e0f-bc17-53d24604d474-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd462bb4-44f5-4e0f-bc17-53d24604d474" (UID: "dd462bb4-44f5-4e0f-bc17-53d24604d474"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.105610 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd462bb4-44f5-4e0f-bc17-53d24604d474-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.105878 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd462bb4-44f5-4e0f-bc17-53d24604d474-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.105957 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5pt7\" (UniqueName: \"kubernetes.io/projected/dd462bb4-44f5-4e0f-bc17-53d24604d474-kube-api-access-x5pt7\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.213179 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16314cf6-663f-4fa9-a1e7-272c1a183b58" path="/var/lib/kubelet/pods/16314cf6-663f-4fa9-a1e7-272c1a183b58/volumes" Mar 
18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.214751 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d5f0191-2702-46ed-ab82-e8c93ec1cf02" path="/var/lib/kubelet/pods/3d5f0191-2702-46ed-ab82-e8c93ec1cf02/volumes" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.215541 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" path="/var/lib/kubelet/pods/aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c/volumes" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.217584 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b60d757b-db66-46c1-ad92-4a9e591217a0" path="/var/lib/kubelet/pods/b60d757b-db66-46c1-ad92-4a9e591217a0/volumes" Mar 18 18:27:30 crc kubenswrapper[5008]: E0318 18:27:30.274247 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:27:30 crc kubenswrapper[5008]: E0318 18:27:30.276548 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:27:30 crc kubenswrapper[5008]: E0318 18:27:30.276785 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:27:30 crc kubenswrapper[5008]: E0318 18:27:30.277368 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:27:30 crc kubenswrapper[5008]: E0318 18:27:30.277401 5008 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovsdb-server" Mar 18 18:27:30 crc kubenswrapper[5008]: E0318 18:27:30.280342 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:27:30 crc kubenswrapper[5008]: E0318 18:27:30.282952 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 18:27:30 crc kubenswrapper[5008]: E0318 18:27:30.283024 5008 prober.go:104] "Probe errored" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovs-vswitchd" Mar 18 18:27:30 crc kubenswrapper[5008]: E0318 18:27:30.335963 5008 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefb8b8df_bedb_4e35_b709_25e83be00470.slice/crio-127d2c079fbd21e4853737582a34391032b25f34b9809690c073c3a085dcc538\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd462bb4_44f5_4e0f_bc17_53d24604d474.slice/crio-a4872b08063355800bfb36664d6f95c7a3bbc1b3d95899e36c12794882bb383f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd462bb4_44f5_4e0f_bc17_53d24604d474.slice\": RecentStats: unable to find data in memory cache]" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.337438 5008 generic.go:334] "Generic (PLEG): container finished" podID="dd462bb4-44f5-4e0f-bc17-53d24604d474" containerID="efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a" exitCode=0 Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.337504 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dd462bb4-44f5-4e0f-bc17-53d24604d474","Type":"ContainerDied","Data":"efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a"} Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.337534 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dd462bb4-44f5-4e0f-bc17-53d24604d474","Type":"ContainerDied","Data":"a4872b08063355800bfb36664d6f95c7a3bbc1b3d95899e36c12794882bb383f"} Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.337602 5008 
scope.go:117] "RemoveContainer" containerID="efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.337723 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.344531 5008 generic.go:334] "Generic (PLEG): container finished" podID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerID="9c0e2fb1769c01fdd6e7cf853c2f7ba8ba4a7c7185217812a0c6f433539ebd75" exitCode=0 Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.344674 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05f0e04a-507a-42ba-97ff-d91aa199b3db","Type":"ContainerDied","Data":"9c0e2fb1769c01fdd6e7cf853c2f7ba8ba4a7c7185217812a0c6f433539ebd75"} Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.372363 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.380857 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.434781 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.450093 5008 scope.go:117] "RemoveContainer" containerID="efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a" Mar 18 18:27:30 crc kubenswrapper[5008]: E0318 18:27:30.450538 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a\": container with ID starting with efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a not found: ID does not exist" containerID="efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.450695 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a"} err="failed to get container status \"efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a\": rpc error: code = NotFound desc = could not find container \"efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a\": container with ID starting with efab985d43efeeea5313035ebcb8a132e3a8c05197729a71f36b1c61b7901a2a not found: ID does not exist" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.510670 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-sg-core-conf-yaml\") pod \"05f0e04a-507a-42ba-97ff-d91aa199b3db\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.510747 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f0e04a-507a-42ba-97ff-d91aa199b3db-log-httpd\") pod \"05f0e04a-507a-42ba-97ff-d91aa199b3db\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " Mar 18 
18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.510786 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f0e04a-507a-42ba-97ff-d91aa199b3db-run-httpd\") pod \"05f0e04a-507a-42ba-97ff-d91aa199b3db\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.510826 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-config-data\") pod \"05f0e04a-507a-42ba-97ff-d91aa199b3db\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.510855 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-ceilometer-tls-certs\") pod \"05f0e04a-507a-42ba-97ff-d91aa199b3db\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.510884 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzh4j\" (UniqueName: \"kubernetes.io/projected/05f0e04a-507a-42ba-97ff-d91aa199b3db-kube-api-access-rzh4j\") pod \"05f0e04a-507a-42ba-97ff-d91aa199b3db\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.510961 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-combined-ca-bundle\") pod \"05f0e04a-507a-42ba-97ff-d91aa199b3db\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.511004 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-scripts\") pod \"05f0e04a-507a-42ba-97ff-d91aa199b3db\" (UID: \"05f0e04a-507a-42ba-97ff-d91aa199b3db\") " Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.512753 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05f0e04a-507a-42ba-97ff-d91aa199b3db-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "05f0e04a-507a-42ba-97ff-d91aa199b3db" (UID: "05f0e04a-507a-42ba-97ff-d91aa199b3db"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.513008 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05f0e04a-507a-42ba-97ff-d91aa199b3db-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "05f0e04a-507a-42ba-97ff-d91aa199b3db" (UID: "05f0e04a-507a-42ba-97ff-d91aa199b3db"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.519194 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-scripts" (OuterVolumeSpecName: "scripts") pod "05f0e04a-507a-42ba-97ff-d91aa199b3db" (UID: "05f0e04a-507a-42ba-97ff-d91aa199b3db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.526653 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05f0e04a-507a-42ba-97ff-d91aa199b3db-kube-api-access-rzh4j" (OuterVolumeSpecName: "kube-api-access-rzh4j") pod "05f0e04a-507a-42ba-97ff-d91aa199b3db" (UID: "05f0e04a-507a-42ba-97ff-d91aa199b3db"). InnerVolumeSpecName "kube-api-access-rzh4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.534701 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "05f0e04a-507a-42ba-97ff-d91aa199b3db" (UID: "05f0e04a-507a-42ba-97ff-d91aa199b3db"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.555390 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "05f0e04a-507a-42ba-97ff-d91aa199b3db" (UID: "05f0e04a-507a-42ba-97ff-d91aa199b3db"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.570149 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05f0e04a-507a-42ba-97ff-d91aa199b3db" (UID: "05f0e04a-507a-42ba-97ff-d91aa199b3db"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.613076 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.613104 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.613115 5008 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.613123 5008 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f0e04a-507a-42ba-97ff-d91aa199b3db-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.613131 5008 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05f0e04a-507a-42ba-97ff-d91aa199b3db-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.613138 5008 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.613148 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzh4j\" (UniqueName: \"kubernetes.io/projected/05f0e04a-507a-42ba-97ff-d91aa199b3db-kube-api-access-rzh4j\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.617161 5008 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-config-data" (OuterVolumeSpecName: "config-data") pod "05f0e04a-507a-42ba-97ff-d91aa199b3db" (UID: "05f0e04a-507a-42ba-97ff-d91aa199b3db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:30 crc kubenswrapper[5008]: I0318 18:27:30.714244 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f0e04a-507a-42ba-97ff-d91aa199b3db-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:31 crc kubenswrapper[5008]: I0318 18:27:31.371492 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05f0e04a-507a-42ba-97ff-d91aa199b3db","Type":"ContainerDied","Data":"87dbc1eb80f4350eb375a5fad29317eccf81cea784572c73702957ff6a703201"} Mar 18 18:27:31 crc kubenswrapper[5008]: I0318 18:27:31.371950 5008 scope.go:117] "RemoveContainer" containerID="9af388372b24fa973623ad52fe731bca5ce89b4e94b9e0a247770b6b45d5bade" Mar 18 18:27:31 crc kubenswrapper[5008]: I0318 18:27:31.372139 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 18:27:31 crc kubenswrapper[5008]: I0318 18:27:31.424208 5008 scope.go:117] "RemoveContainer" containerID="3a5bb1f257b5ccc8acb612686fc959dffa3d2d0416f76fda835c27174ee06b01" Mar 18 18:27:31 crc kubenswrapper[5008]: I0318 18:27:31.425825 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:27:31 crc kubenswrapper[5008]: I0318 18:27:31.432434 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 18:27:31 crc kubenswrapper[5008]: I0318 18:27:31.447433 5008 scope.go:117] "RemoveContainer" containerID="9c0e2fb1769c01fdd6e7cf853c2f7ba8ba4a7c7185217812a0c6f433539ebd75" Mar 18 18:27:31 crc kubenswrapper[5008]: I0318 18:27:31.477726 5008 scope.go:117] "RemoveContainer" containerID="c59553cd5157024b46624fb3c5ef58bbf4ed0861562e0adaaec3f71273dc5a65" Mar 18 18:27:32 crc kubenswrapper[5008]: I0318 18:27:32.217166 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" path="/var/lib/kubelet/pods/05f0e04a-507a-42ba-97ff-d91aa199b3db/volumes" Mar 18 18:27:32 crc kubenswrapper[5008]: I0318 18:27:32.218525 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd462bb4-44f5-4e0f-bc17-53d24604d474" path="/var/lib/kubelet/pods/dd462bb4-44f5-4e0f-bc17-53d24604d474/volumes" Mar 18 18:27:35 crc kubenswrapper[5008]: E0318 18:27:35.273969 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 18:27:35 crc kubenswrapper[5008]: E0318 18:27:35.274896 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:27:35 crc kubenswrapper[5008]: E0318 18:27:35.275547 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:27:35 crc kubenswrapper[5008]: E0318 18:27:35.275535 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:27:35 crc kubenswrapper[5008]: E0318 18:27:35.275663 5008 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovsdb-server"
Mar 18 18:27:35 crc kubenswrapper[5008]: E0318 18:27:35.277753 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:27:35 crc kubenswrapper[5008]: E0318 18:27:35.279841 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:27:35 crc kubenswrapper[5008]: E0318 18:27:35.279893 5008 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovs-vswitchd"
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.466453 5008 generic.go:334] "Generic (PLEG): container finished" podID="0c9299b1-8e15-4e9c-bada-ce88af9c1c28" containerID="37ee661f7953b8d9a32a6d1f71d0668b96eefbaeef42d315f3b9741a68892653" exitCode=0
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.466530 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f9f77dc-mg4p7" event={"ID":"0c9299b1-8e15-4e9c-bada-ce88af9c1c28","Type":"ContainerDied","Data":"37ee661f7953b8d9a32a6d1f71d0668b96eefbaeef42d315f3b9741a68892653"}
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.466605 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85f9f77dc-mg4p7" event={"ID":"0c9299b1-8e15-4e9c-bada-ce88af9c1c28","Type":"ContainerDied","Data":"9f24721f654fa721da03e8ccd3a5475770347df8a8b67e723417237b01c2f097"}
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.466629 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f24721f654fa721da03e8ccd3a5475770347df8a8b67e723417237b01c2f097"
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.504666 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85f9f77dc-mg4p7"
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.630251 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-public-tls-certs\") pod \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") "
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.630352 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-httpd-config\") pod \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") "
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.630419 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9lfg\" (UniqueName: \"kubernetes.io/projected/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-kube-api-access-t9lfg\") pod \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") "
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.630460 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-internal-tls-certs\") pod \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") "
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.630501 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-config\") pod \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") "
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.630523 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-ovndb-tls-certs\") pod \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") "
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.630599 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-combined-ca-bundle\") pod \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\" (UID: \"0c9299b1-8e15-4e9c-bada-ce88af9c1c28\") "
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.636284 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-kube-api-access-t9lfg" (OuterVolumeSpecName: "kube-api-access-t9lfg") pod "0c9299b1-8e15-4e9c-bada-ce88af9c1c28" (UID: "0c9299b1-8e15-4e9c-bada-ce88af9c1c28"). InnerVolumeSpecName "kube-api-access-t9lfg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.636536 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0c9299b1-8e15-4e9c-bada-ce88af9c1c28" (UID: "0c9299b1-8e15-4e9c-bada-ce88af9c1c28"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.665508 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c9299b1-8e15-4e9c-bada-ce88af9c1c28" (UID: "0c9299b1-8e15-4e9c-bada-ce88af9c1c28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.666081 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-config" (OuterVolumeSpecName: "config") pod "0c9299b1-8e15-4e9c-bada-ce88af9c1c28" (UID: "0c9299b1-8e15-4e9c-bada-ce88af9c1c28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.667289 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0c9299b1-8e15-4e9c-bada-ce88af9c1c28" (UID: "0c9299b1-8e15-4e9c-bada-ce88af9c1c28"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.675130 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0c9299b1-8e15-4e9c-bada-ce88af9c1c28" (UID: "0c9299b1-8e15-4e9c-bada-ce88af9c1c28"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.689777 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0c9299b1-8e15-4e9c-bada-ce88af9c1c28" (UID: "0c9299b1-8e15-4e9c-bada-ce88af9c1c28"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.732959 5008 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.732995 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9lfg\" (UniqueName: \"kubernetes.io/projected/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-kube-api-access-t9lfg\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.733008 5008 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.733019 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-config\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.733028 5008 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.733036 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:37 crc kubenswrapper[5008]: I0318 18:27:37.733044 5008 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9299b1-8e15-4e9c-bada-ce88af9c1c28-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:38 crc kubenswrapper[5008]: I0318 18:27:38.480695 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85f9f77dc-mg4p7"
Mar 18 18:27:38 crc kubenswrapper[5008]: I0318 18:27:38.522135 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-85f9f77dc-mg4p7"]
Mar 18 18:27:38 crc kubenswrapper[5008]: I0318 18:27:38.533259 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-85f9f77dc-mg4p7"]
Mar 18 18:27:40 crc kubenswrapper[5008]: I0318 18:27:40.214993 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9299b1-8e15-4e9c-bada-ce88af9c1c28" path="/var/lib/kubelet/pods/0c9299b1-8e15-4e9c-bada-ce88af9c1c28/volumes"
Mar 18 18:27:40 crc kubenswrapper[5008]: E0318 18:27:40.273677 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:27:40 crc kubenswrapper[5008]: E0318 18:27:40.275276 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:27:40 crc kubenswrapper[5008]: E0318 18:27:40.275794 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:27:40 crc kubenswrapper[5008]: E0318 18:27:40.275894 5008 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovsdb-server"
Mar 18 18:27:40 crc kubenswrapper[5008]: E0318 18:27:40.276068 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:27:40 crc kubenswrapper[5008]: E0318 18:27:40.277871 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:27:40 crc kubenswrapper[5008]: E0318 18:27:40.280587 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:27:40 crc kubenswrapper[5008]: E0318 18:27:40.280660 5008 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovs-vswitchd"
Mar 18 18:27:40 crc kubenswrapper[5008]: E0318 18:27:40.509389 5008 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefb8b8df_bedb_4e35_b709_25e83be00470.slice/crio-127d2c079fbd21e4853737582a34391032b25f34b9809690c073c3a085dcc538\": RecentStats: unable to find data in memory cache]"
Mar 18 18:27:45 crc kubenswrapper[5008]: E0318 18:27:45.274602 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:27:45 crc kubenswrapper[5008]: E0318 18:27:45.275828 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:27:45 crc kubenswrapper[5008]: E0318 18:27:45.276114 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:27:45 crc kubenswrapper[5008]: E0318 18:27:45.276220 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:27:45 crc kubenswrapper[5008]: E0318 18:27:45.276260 5008 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovsdb-server"
Mar 18 18:27:45 crc kubenswrapper[5008]: E0318 18:27:45.278303 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:27:45 crc kubenswrapper[5008]: E0318 18:27:45.281446 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:27:45 crc kubenswrapper[5008]: E0318 18:27:45.281603 5008 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovs-vswitchd"
Mar 18 18:27:50 crc kubenswrapper[5008]: E0318 18:27:50.274669 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:27:50 crc kubenswrapper[5008]: E0318 18:27:50.276048 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:27:50 crc kubenswrapper[5008]: E0318 18:27:50.276707 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:27:50 crc kubenswrapper[5008]: E0318 18:27:50.276681 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 18:27:50 crc kubenswrapper[5008]: E0318 18:27:50.276801 5008 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovsdb-server"
Mar 18 18:27:50 crc kubenswrapper[5008]: E0318 18:27:50.279381 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:27:50 crc kubenswrapper[5008]: E0318 18:27:50.281431 5008 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 18:27:50 crc kubenswrapper[5008]: E0318 18:27:50.281496 5008 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-x8pkm" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovs-vswitchd"
Mar 18 18:27:50 crc kubenswrapper[5008]: E0318 18:27:50.680017 5008 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefb8b8df_bedb_4e35_b709_25e83be00470.slice/crio-127d2c079fbd21e4853737582a34391032b25f34b9809690c073c3a085dcc538\": RecentStats: unable to find data in memory cache]"
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.613771 5008 generic.go:334] "Generic (PLEG): container finished" podID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerID="16ca62d6ed1f662b6bd0ea0c5af9755fe9a957be9453f4224460900b731f6943" exitCode=137
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.613914 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"16ca62d6ed1f662b6bd0ea0c5af9755fe9a957be9453f4224460900b731f6943"}
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.872894 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.955675 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fcb3859a-2fc0-4479-a59d-7888246899a9-cache\") pod \"fcb3859a-2fc0-4479-a59d-7888246899a9\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") "
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.955730 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb3859a-2fc0-4479-a59d-7888246899a9-combined-ca-bundle\") pod \"fcb3859a-2fc0-4479-a59d-7888246899a9\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") "
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.955751 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"fcb3859a-2fc0-4479-a59d-7888246899a9\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") "
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.955768 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift\") pod \"fcb3859a-2fc0-4479-a59d-7888246899a9\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") "
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.955791 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nrx4\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-kube-api-access-5nrx4\") pod \"fcb3859a-2fc0-4479-a59d-7888246899a9\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") "
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.955811 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fcb3859a-2fc0-4479-a59d-7888246899a9-lock\") pod \"fcb3859a-2fc0-4479-a59d-7888246899a9\" (UID: \"fcb3859a-2fc0-4479-a59d-7888246899a9\") "
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.956398 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcb3859a-2fc0-4479-a59d-7888246899a9-cache" (OuterVolumeSpecName: "cache") pod "fcb3859a-2fc0-4479-a59d-7888246899a9" (UID: "fcb3859a-2fc0-4479-a59d-7888246899a9"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.956420 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcb3859a-2fc0-4479-a59d-7888246899a9-lock" (OuterVolumeSpecName: "lock") pod "fcb3859a-2fc0-4479-a59d-7888246899a9" (UID: "fcb3859a-2fc0-4479-a59d-7888246899a9"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.960684 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fcb3859a-2fc0-4479-a59d-7888246899a9" (UID: "fcb3859a-2fc0-4479-a59d-7888246899a9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.961243 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-kube-api-access-5nrx4" (OuterVolumeSpecName: "kube-api-access-5nrx4") pod "fcb3859a-2fc0-4479-a59d-7888246899a9" (UID: "fcb3859a-2fc0-4479-a59d-7888246899a9"). InnerVolumeSpecName "kube-api-access-5nrx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:27:51 crc kubenswrapper[5008]: I0318 18:27:51.963710 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "fcb3859a-2fc0-4479-a59d-7888246899a9" (UID: "fcb3859a-2fc0-4479-a59d-7888246899a9"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.056700 5008 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fcb3859a-2fc0-4479-a59d-7888246899a9-cache\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.057124 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.057169 5008 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.057185 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nrx4\" (UniqueName: \"kubernetes.io/projected/fcb3859a-2fc0-4479-a59d-7888246899a9-kube-api-access-5nrx4\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.057198 5008 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fcb3859a-2fc0-4479-a59d-7888246899a9-lock\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.079607 5008 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.097746 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.104610 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-x8pkm_f55031bd-9626-475f-a74f-d0e5f8ec8a66/ovs-vswitchd/0.log"
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.105735 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-x8pkm"
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.157991 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-etc-ovs\") pod \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") "
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.158049 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-run\") pod \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") "
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.158073 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-etc-machine-id\") pod \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") "
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.158093 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-lib\") pod \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") "
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.158095 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "f55031bd-9626-475f-a74f-d0e5f8ec8a66" (UID: "f55031bd-9626-475f-a74f-d0e5f8ec8a66"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.158116 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f55031bd-9626-475f-a74f-d0e5f8ec8a66-scripts\") pod \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") "
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.158136 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" (UID: "be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.158152 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-run" (OuterVolumeSpecName: "var-run") pod "f55031bd-9626-475f-a74f-d0e5f8ec8a66" (UID: "f55031bd-9626-475f-a74f-d0e5f8ec8a66"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.158171 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-lib" (OuterVolumeSpecName: "var-lib") pod "f55031bd-9626-475f-a74f-d0e5f8ec8a66" (UID: "f55031bd-9626-475f-a74f-d0e5f8ec8a66"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.159291 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f55031bd-9626-475f-a74f-d0e5f8ec8a66-scripts" (OuterVolumeSpecName: "scripts") pod "f55031bd-9626-475f-a74f-d0e5f8ec8a66" (UID: "f55031bd-9626-475f-a74f-d0e5f8ec8a66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.159299 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-log\") pod \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") "
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.159321 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-log" (OuterVolumeSpecName: "var-log") pod "f55031bd-9626-475f-a74f-d0e5f8ec8a66" (UID: "f55031bd-9626-475f-a74f-d0e5f8ec8a66"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.159353 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-config-data\") pod \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") "
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.159447 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-scripts\") pod \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") "
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.159474 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-config-data-custom\") pod \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") "
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.159945 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhzsq\" (UniqueName: \"kubernetes.io/projected/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-kube-api-access-lhzsq\") pod \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") "
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.159970 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-478tc\" (UniqueName: \"kubernetes.io/projected/f55031bd-9626-475f-a74f-d0e5f8ec8a66-kube-api-access-478tc\") pod \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\" (UID: \"f55031bd-9626-475f-a74f-d0e5f8ec8a66\") "
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.159993 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-combined-ca-bundle\") pod \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\" (UID: \"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9\") "
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.160462 5008 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-etc-ovs\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.160524 5008 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-run\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.160534 5008 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.160544 5008 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-lib\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.160563 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f55031bd-9626-475f-a74f-d0e5f8ec8a66-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.160572 5008 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f55031bd-9626-475f-a74f-d0e5f8ec8a66-var-log\") on node \"crc\" DevicePath \"\""
Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.160580 5008 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Mar
18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.162348 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-kube-api-access-lhzsq" (OuterVolumeSpecName: "kube-api-access-lhzsq") pod "be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" (UID: "be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9"). InnerVolumeSpecName "kube-api-access-lhzsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.162686 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-scripts" (OuterVolumeSpecName: "scripts") pod "be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" (UID: "be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.163495 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" (UID: "be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.164503 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f55031bd-9626-475f-a74f-d0e5f8ec8a66-kube-api-access-478tc" (OuterVolumeSpecName: "kube-api-access-478tc") pod "f55031bd-9626-475f-a74f-d0e5f8ec8a66" (UID: "f55031bd-9626-475f-a74f-d0e5f8ec8a66"). InnerVolumeSpecName "kube-api-access-478tc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.191049 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" (UID: "be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.201268 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb3859a-2fc0-4479-a59d-7888246899a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcb3859a-2fc0-4479-a59d-7888246899a9" (UID: "fcb3859a-2fc0-4479-a59d-7888246899a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.251021 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-config-data" (OuterVolumeSpecName: "config-data") pod "be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" (UID: "be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.262093 5008 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.262121 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb3859a-2fc0-4479-a59d-7888246899a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.262132 5008 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.262224 5008 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.262237 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhzsq\" (UniqueName: \"kubernetes.io/projected/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-kube-api-access-lhzsq\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.262248 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-478tc\" (UniqueName: \"kubernetes.io/projected/f55031bd-9626-475f-a74f-d0e5f8ec8a66-kube-api-access-478tc\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.262256 5008 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 
18:27:52.627466 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-x8pkm_f55031bd-9626-475f-a74f-d0e5f8ec8a66/ovs-vswitchd/0.log" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.628513 5008 generic.go:334] "Generic (PLEG): container finished" podID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" exitCode=137 Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.628601 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-x8pkm" event={"ID":"f55031bd-9626-475f-a74f-d0e5f8ec8a66","Type":"ContainerDied","Data":"90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f"} Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.628632 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-x8pkm" event={"ID":"f55031bd-9626-475f-a74f-d0e5f8ec8a66","Type":"ContainerDied","Data":"a175d6328c98d5d881a54a4fccffe8543c4577dce2965fc3617be2f366d880e5"} Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.628652 5008 scope.go:117] "RemoveContainer" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.628812 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-x8pkm" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.631790 5008 generic.go:334] "Generic (PLEG): container finished" podID="be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" containerID="20f1fb66e65134755453d78cfd50a9bd3959e1e313d6a2e4fcb63644bb833236" exitCode=137 Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.631908 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9","Type":"ContainerDied","Data":"20f1fb66e65134755453d78cfd50a9bd3959e1e313d6a2e4fcb63644bb833236"} Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.631996 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9","Type":"ContainerDied","Data":"8cdd24a0ab60621ebf8f6f65dc159f5b4b227aee6053095397ed6a70abcea49f"} Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.632130 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.645809 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fcb3859a-2fc0-4479-a59d-7888246899a9","Type":"ContainerDied","Data":"fc4790e13dff7b48eb97ee34aebdb11f419a797789fcd267c5a32795ed706da6"} Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.646239 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.665134 5008 scope.go:117] "RemoveContainer" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.665403 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-x8pkm"] Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.690963 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-x8pkm"] Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.711637 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.716506 5008 scope.go:117] "RemoveContainer" containerID="2a851048633c8702a158c7035b8ed097c86030e31bb77b6fbb19f72283c707c2" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.716880 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.724437 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.730335 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.746804 5008 scope.go:117] "RemoveContainer" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" Mar 18 18:27:52 crc kubenswrapper[5008]: E0318 18:27:52.747499 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f\": container with ID starting with 90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f not found: ID does not exist" containerID="90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f" Mar 18 
18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.747529 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f"} err="failed to get container status \"90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f\": rpc error: code = NotFound desc = could not find container \"90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f\": container with ID starting with 90a34eb7f5df9e190021b266015d6b7a3fc41055027e606adaec8a04b6d8070f not found: ID does not exist" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.747566 5008 scope.go:117] "RemoveContainer" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" Mar 18 18:27:52 crc kubenswrapper[5008]: E0318 18:27:52.748036 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57\": container with ID starting with 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 not found: ID does not exist" containerID="240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.748084 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57"} err="failed to get container status \"240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57\": rpc error: code = NotFound desc = could not find container \"240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57\": container with ID starting with 240d0dc619001fcf9939890fbf6c5d3943d7007479a736c1e5bdeef1870eab57 not found: ID does not exist" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.748105 5008 scope.go:117] "RemoveContainer" 
containerID="2a851048633c8702a158c7035b8ed097c86030e31bb77b6fbb19f72283c707c2" Mar 18 18:27:52 crc kubenswrapper[5008]: E0318 18:27:52.748627 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a851048633c8702a158c7035b8ed097c86030e31bb77b6fbb19f72283c707c2\": container with ID starting with 2a851048633c8702a158c7035b8ed097c86030e31bb77b6fbb19f72283c707c2 not found: ID does not exist" containerID="2a851048633c8702a158c7035b8ed097c86030e31bb77b6fbb19f72283c707c2" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.748647 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a851048633c8702a158c7035b8ed097c86030e31bb77b6fbb19f72283c707c2"} err="failed to get container status \"2a851048633c8702a158c7035b8ed097c86030e31bb77b6fbb19f72283c707c2\": rpc error: code = NotFound desc = could not find container \"2a851048633c8702a158c7035b8ed097c86030e31bb77b6fbb19f72283c707c2\": container with ID starting with 2a851048633c8702a158c7035b8ed097c86030e31bb77b6fbb19f72283c707c2 not found: ID does not exist" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.748659 5008 scope.go:117] "RemoveContainer" containerID="2f33719f92055074851b51599de618f808c5a4383b6fb86e503d5b01cf723941" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.793983 5008 scope.go:117] "RemoveContainer" containerID="20f1fb66e65134755453d78cfd50a9bd3959e1e313d6a2e4fcb63644bb833236" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.819589 5008 scope.go:117] "RemoveContainer" containerID="2f33719f92055074851b51599de618f808c5a4383b6fb86e503d5b01cf723941" Mar 18 18:27:52 crc kubenswrapper[5008]: E0318 18:27:52.820056 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f33719f92055074851b51599de618f808c5a4383b6fb86e503d5b01cf723941\": container with ID starting with 
2f33719f92055074851b51599de618f808c5a4383b6fb86e503d5b01cf723941 not found: ID does not exist" containerID="2f33719f92055074851b51599de618f808c5a4383b6fb86e503d5b01cf723941" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.820092 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f33719f92055074851b51599de618f808c5a4383b6fb86e503d5b01cf723941"} err="failed to get container status \"2f33719f92055074851b51599de618f808c5a4383b6fb86e503d5b01cf723941\": rpc error: code = NotFound desc = could not find container \"2f33719f92055074851b51599de618f808c5a4383b6fb86e503d5b01cf723941\": container with ID starting with 2f33719f92055074851b51599de618f808c5a4383b6fb86e503d5b01cf723941 not found: ID does not exist" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.820119 5008 scope.go:117] "RemoveContainer" containerID="20f1fb66e65134755453d78cfd50a9bd3959e1e313d6a2e4fcb63644bb833236" Mar 18 18:27:52 crc kubenswrapper[5008]: E0318 18:27:52.820644 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20f1fb66e65134755453d78cfd50a9bd3959e1e313d6a2e4fcb63644bb833236\": container with ID starting with 20f1fb66e65134755453d78cfd50a9bd3959e1e313d6a2e4fcb63644bb833236 not found: ID does not exist" containerID="20f1fb66e65134755453d78cfd50a9bd3959e1e313d6a2e4fcb63644bb833236" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.820695 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f1fb66e65134755453d78cfd50a9bd3959e1e313d6a2e4fcb63644bb833236"} err="failed to get container status \"20f1fb66e65134755453d78cfd50a9bd3959e1e313d6a2e4fcb63644bb833236\": rpc error: code = NotFound desc = could not find container \"20f1fb66e65134755453d78cfd50a9bd3959e1e313d6a2e4fcb63644bb833236\": container with ID starting with 20f1fb66e65134755453d78cfd50a9bd3959e1e313d6a2e4fcb63644bb833236 not found: ID does not 
exist" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.820716 5008 scope.go:117] "RemoveContainer" containerID="16ca62d6ed1f662b6bd0ea0c5af9755fe9a957be9453f4224460900b731f6943" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.849083 5008 scope.go:117] "RemoveContainer" containerID="380cc5591873123d91f18022fd060b0a9e10c5e3b072ae816f61a2e6ad015a78" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.878571 5008 scope.go:117] "RemoveContainer" containerID="ee0fd9858e770e37fee73845e8c0a241a341746edd5488d756144d0dbce6ee7b" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.922707 5008 scope.go:117] "RemoveContainer" containerID="e00afaaa564c37f366ee4ac26eb8ca94d2c1e8b26ed42d7509ff378a29f8f96a" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.949807 5008 scope.go:117] "RemoveContainer" containerID="4f571497d968bf39f24266de4994c7de6a2c821baa3ad302407cf536047c662e" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.969436 5008 scope.go:117] "RemoveContainer" containerID="3b7732cd3cbc9f6e46f3b52e181285bcbcb64ff5a7d634bf4399f0d57729ef65" Mar 18 18:27:52 crc kubenswrapper[5008]: I0318 18:27:52.990926 5008 scope.go:117] "RemoveContainer" containerID="0b1ee5c8c45f6646ada20310701b8ec3f99b2a8128a2190acf71a6ef29f4200a" Mar 18 18:27:53 crc kubenswrapper[5008]: I0318 18:27:53.006338 5008 scope.go:117] "RemoveContainer" containerID="2a06525b664dc560a781b00430903d7869796e656f728e7637b34cc39532a99e" Mar 18 18:27:53 crc kubenswrapper[5008]: I0318 18:27:53.023667 5008 scope.go:117] "RemoveContainer" containerID="aa11f18a13f730403dae487c0f3224a3b6d6266ab6e0fc1aab36fa0cff77ecb4" Mar 18 18:27:53 crc kubenswrapper[5008]: I0318 18:27:53.039451 5008 scope.go:117] "RemoveContainer" containerID="21b72f72c110b5cddd921bb4d2588f810988fa4c91525dd72e97c92a5f5d881d" Mar 18 18:27:53 crc kubenswrapper[5008]: I0318 18:27:53.061871 5008 scope.go:117] "RemoveContainer" containerID="a3d478398fbcc00ebce85e7d90128952489a28ad02808dcc006fc3822c4fdaba" Mar 18 18:27:53 crc 
kubenswrapper[5008]: I0318 18:27:53.078981 5008 scope.go:117] "RemoveContainer" containerID="b5255bfa8eb99b8162ba17c46557ccb30518ff7df2b8694d473240b663a9ce8c" Mar 18 18:27:53 crc kubenswrapper[5008]: I0318 18:27:53.099394 5008 scope.go:117] "RemoveContainer" containerID="688589b78817d925eba18cc083d7aae7884af996d5eac87b2f9b8be694e1d743" Mar 18 18:27:53 crc kubenswrapper[5008]: I0318 18:27:53.120687 5008 scope.go:117] "RemoveContainer" containerID="b75695d9f9722a67c19ee08c21555a403a91fc1e836d2a6b7c94c581c39bc7e8" Mar 18 18:27:53 crc kubenswrapper[5008]: I0318 18:27:53.140523 5008 scope.go:117] "RemoveContainer" containerID="b2ff7dec8963820747dd167a23cc98ef08104fcb6886f42e0c188e8c2d2b5557" Mar 18 18:27:54 crc kubenswrapper[5008]: I0318 18:27:54.213376 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" path="/var/lib/kubelet/pods/be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9/volumes" Mar 18 18:27:54 crc kubenswrapper[5008]: I0318 18:27:54.214743 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" path="/var/lib/kubelet/pods/f55031bd-9626-475f-a74f-d0e5f8ec8a66/volumes" Mar 18 18:27:54 crc kubenswrapper[5008]: I0318 18:27:54.216607 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" path="/var/lib/kubelet/pods/fcb3859a-2fc0-4479-a59d-7888246899a9/volumes" Mar 18 18:27:54 crc kubenswrapper[5008]: I0318 18:27:54.460030 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:27:54 crc kubenswrapper[5008]: I0318 18:27:54.460095 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" 
podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:27:58 crc kubenswrapper[5008]: I0318 18:27:58.460399 5008 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod16314cf6-663f-4fa9-a1e7-272c1a183b58"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod16314cf6-663f-4fa9-a1e7-272c1a183b58] : Timed out while waiting for systemd to remove kubepods-besteffort-pod16314cf6_663f_4fa9_a1e7_272c1a183b58.slice" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.163690 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564308-vdh85"] Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164021 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9299b1-8e15-4e9c-bada-ce88af9c1c28" containerName="neutron-httpd" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164036 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9299b1-8e15-4e9c-bada-ce88af9c1c28" containerName="neutron-httpd" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164053 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" containerName="ovn-controller" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164061 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" containerName="ovn-controller" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164078 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="swift-recon-cron" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164088 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" 
containerName="swift-recon-cron" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164099 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96efea0e-17ae-49c4-8f5c-b7341def6878" containerName="nova-metadata-metadata" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164108 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="96efea0e-17ae-49c4-8f5c-b7341def6878" containerName="nova-metadata-metadata" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164119 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-replicator" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164128 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-replicator" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164144 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" containerName="placement-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164152 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" containerName="placement-log" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164167 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07bd6644-ca18-4b8d-ad83-9757257768fb" containerName="galera" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164175 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="07bd6644-ca18-4b8d-ad83-9757257768fb" containerName="galera" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164193 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-expirer" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164201 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" 
containerName="object-expirer" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164213 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8724770c-4223-4cfe-b35b-be7cd1a6a9ff" containerName="mysql-bootstrap" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164220 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8724770c-4223-4cfe-b35b-be7cd1a6a9ff" containerName="mysql-bootstrap" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164234 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-updater" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164242 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-updater" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164256 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda3600a-d612-43ec-8b45-77eccc420b0f" containerName="nova-api-api" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164264 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda3600a-d612-43ec-8b45-77eccc420b0f" containerName="nova-api-api" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164278 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84" containerName="kube-state-metrics" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164286 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84" containerName="kube-state-metrics" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164294 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8679cebf-8eea-45ae-be70-26eea9396f8e" containerName="glance-httpd" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164302 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8679cebf-8eea-45ae-be70-26eea9396f8e" 
containerName="glance-httpd" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164316 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-auditor" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164324 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-auditor" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164333 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5f0191-2702-46ed-ab82-e8c93ec1cf02" containerName="setup-container" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164343 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5f0191-2702-46ed-ab82-e8c93ec1cf02" containerName="setup-container" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164357 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07bd6644-ca18-4b8d-ad83-9757257768fb" containerName="mysql-bootstrap" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164365 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="07bd6644-ca18-4b8d-ad83-9757257768fb" containerName="mysql-bootstrap" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164378 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1c2fc8-83c6-4183-ac62-f23ad5db8610" containerName="ovn-northd" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164386 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1c2fc8-83c6-4183-ac62-f23ad5db8610" containerName="ovn-northd" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164396 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1c2fc8-83c6-4183-ac62-f23ad5db8610" containerName="openstack-network-exporter" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164405 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1c2fc8-83c6-4183-ac62-f23ad5db8610" 
containerName="openstack-network-exporter" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164416 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8724770c-4223-4cfe-b35b-be7cd1a6a9ff" containerName="galera" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164425 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8724770c-4223-4cfe-b35b-be7cd1a6a9ff" containerName="galera" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164439 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" containerName="probe" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164447 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" containerName="probe" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164455 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="proxy-httpd" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164463 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="proxy-httpd" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164475 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9299b1-8e15-4e9c-bada-ce88af9c1c28" containerName="neutron-api" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164483 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9299b1-8e15-4e9c-bada-ce88af9c1c28" containerName="neutron-api" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164498 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-server" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164506 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-server" Mar 18 18:28:00 crc 
kubenswrapper[5008]: E0318 18:28:00.164522 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d27fb392-40df-45a9-aeae-20781d90f02b" containerName="cinder-api-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164530 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27fb392-40df-45a9-aeae-20781d90f02b" containerName="cinder-api-log" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164544 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd462bb4-44f5-4e0f-bc17-53d24604d474" containerName="nova-cell1-conductor-conductor" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164575 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd462bb4-44f5-4e0f-bc17-53d24604d474" containerName="nova-cell1-conductor-conductor" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164585 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" containerName="barbican-api" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164595 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" containerName="barbican-api" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164605 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a03e07-237e-4583-81b4-8d9aadc76ea3" containerName="barbican-keystone-listener" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164613 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a03e07-237e-4583-81b4-8d9aadc76ea3" containerName="barbican-keystone-listener" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164622 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d27fb392-40df-45a9-aeae-20781d90f02b" containerName="cinder-api" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164630 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27fb392-40df-45a9-aeae-20781d90f02b" containerName="cinder-api" 
Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164645 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd78c73-6590-4035-af7d-357b8451f0ad" containerName="memcached" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164653 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd78c73-6590-4035-af7d-357b8451f0ad" containerName="memcached" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164663 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda3600a-d612-43ec-8b45-77eccc420b0f" containerName="nova-api-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164670 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda3600a-d612-43ec-8b45-77eccc420b0f" containerName="nova-api-log" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164681 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5f0191-2702-46ed-ab82-e8c93ec1cf02" containerName="rabbitmq" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164689 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5f0191-2702-46ed-ab82-e8c93ec1cf02" containerName="rabbitmq" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164700 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-server" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164708 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-server" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164720 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovsdb-server" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164728 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovsdb-server" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 
18:28:00.164740 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="sg-core" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164747 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="sg-core" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164764 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60d757b-db66-46c1-ad92-4a9e591217a0" containerName="rabbitmq" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164775 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60d757b-db66-46c1-ad92-4a9e591217a0" containerName="rabbitmq" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164793 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-updater" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164804 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-updater" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164821 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovs-vswitchd" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164829 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovs-vswitchd" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164845 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582dafe2-2020-4966-921d-cc5e9f0db46c" containerName="glance-httpd" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164855 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="582dafe2-2020-4966-921d-cc5e9f0db46c" containerName="glance-httpd" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164867 5008 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-reaper" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164878 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-reaper" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164897 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed55404d-2d05-4776-abed-7579ae87933d" containerName="nova-cell0-conductor-conductor" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164908 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed55404d-2d05-4776-abed-7579ae87933d" containerName="nova-cell0-conductor-conductor" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164928 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8679cebf-8eea-45ae-be70-26eea9396f8e" containerName="glance-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164935 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8679cebf-8eea-45ae-be70-26eea9396f8e" containerName="glance-log" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164949 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96efea0e-17ae-49c4-8f5c-b7341def6878" containerName="nova-metadata-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164957 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="96efea0e-17ae-49c4-8f5c-b7341def6878" containerName="nova-metadata-log" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.164972 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-auditor" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.164983 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-auditor" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165001 5008 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a03e07-237e-4583-81b4-8d9aadc76ea3" containerName="barbican-keystone-listener-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165012 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a03e07-237e-4583-81b4-8d9aadc76ea3" containerName="barbican-keystone-listener-log" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165029 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" containerName="placement-api" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165037 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" containerName="placement-api" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165071 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-server" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165082 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-server" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165097 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26207e6-102f-4160-be7d-e1cad865fcc6" containerName="nova-scheduler-scheduler" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165108 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26207e6-102f-4160-be7d-e1cad865fcc6" containerName="nova-scheduler-scheduler" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165121 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="ceilometer-central-agent" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165131 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="ceilometer-central-agent" Mar 18 18:28:00 crc 
kubenswrapper[5008]: E0318 18:28:00.165145 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovsdb-server-init" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165155 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovsdb-server-init" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165172 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60d757b-db66-46c1-ad92-4a9e591217a0" containerName="setup-container" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165180 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60d757b-db66-46c1-ad92-4a9e591217a0" containerName="setup-container" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165190 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab5f625-144a-4c7c-bab8-5399de3b5a8e" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165199 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab5f625-144a-4c7c-bab8-5399de3b5a8e" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165210 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" containerName="barbican-api-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165218 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" containerName="barbican-api-log" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165233 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582dafe2-2020-4966-921d-cc5e9f0db46c" containerName="glance-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165241 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="582dafe2-2020-4966-921d-cc5e9f0db46c" containerName="glance-log" Mar 
18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165255 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-replicator" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165263 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-replicator" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165272 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" containerName="cinder-scheduler" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165280 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" containerName="cinder-scheduler" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165296 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="rsync" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165305 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="rsync" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165313 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16314cf6-663f-4fa9-a1e7-272c1a183b58" containerName="keystone-api" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165321 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="16314cf6-663f-4fa9-a1e7-272c1a183b58" containerName="keystone-api" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165334 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67f3431-0e44-4d3c-8aa9-0f3fb176387d" containerName="barbican-worker-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165343 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67f3431-0e44-4d3c-8aa9-0f3fb176387d" containerName="barbican-worker-log" Mar 18 18:28:00 crc 
kubenswrapper[5008]: E0318 18:28:00.165356 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-replicator" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165364 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-replicator" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165378 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="ceilometer-notification-agent" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165387 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="ceilometer-notification-agent" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165404 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67f3431-0e44-4d3c-8aa9-0f3fb176387d" containerName="barbican-worker" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165414 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67f3431-0e44-4d3c-8aa9-0f3fb176387d" containerName="barbican-worker" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.165425 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-auditor" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165435 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-auditor" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165617 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovsdb-server" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165638 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a03e07-237e-4583-81b4-8d9aadc76ea3" 
containerName="barbican-keystone-listener-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165657 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="bda3600a-d612-43ec-8b45-77eccc420b0f" containerName="nova-api-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165667 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" containerName="probe" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165675 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec23ac4-7c86-4e9e-96ba-e4ccc406fd84" containerName="kube-state-metrics" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165684 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab5f625-144a-4c7c-bab8-5399de3b5a8e" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165701 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed55404d-2d05-4776-abed-7579ae87933d" containerName="nova-cell0-conductor-conductor" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165713 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="96efea0e-17ae-49c4-8f5c-b7341def6878" containerName="nova-metadata-metadata" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165722 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa3cc5e4-3fd1-48ea-a992-a2b5e76f183c" containerName="ovn-controller" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165737 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="rsync" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165751 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-replicator" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165765 5008 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7f1c2fc8-83c6-4183-ac62-f23ad5db8610" containerName="ovn-northd" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165775 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-auditor" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165786 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26207e6-102f-4160-be7d-e1cad865fcc6" containerName="nova-scheduler-scheduler" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165798 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="582dafe2-2020-4966-921d-cc5e9f0db46c" containerName="glance-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165809 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="582dafe2-2020-4966-921d-cc5e9f0db46c" containerName="glance-httpd" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165820 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67f3431-0e44-4d3c-8aa9-0f3fb176387d" containerName="barbican-worker" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165831 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="bda3600a-d612-43ec-8b45-77eccc420b0f" containerName="nova-api-api" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165840 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="swift-recon-cron" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165850 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" containerName="barbican-api" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165862 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a03e07-237e-4583-81b4-8d9aadc76ea3" containerName="barbican-keystone-listener" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 
18:28:00.165871 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd78c73-6590-4035-af7d-357b8451f0ad" containerName="memcached" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165881 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-server" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165892 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8679cebf-8eea-45ae-be70-26eea9396f8e" containerName="glance-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165903 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="ceilometer-central-agent" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165914 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="sg-core" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165927 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="proxy-httpd" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165938 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="07bd6644-ca18-4b8d-ad83-9757257768fb" containerName="galera" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165951 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="16314cf6-663f-4fa9-a1e7-272c1a183b58" containerName="keystone-api" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165963 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd462bb4-44f5-4e0f-bc17-53d24604d474" containerName="nova-cell1-conductor-conductor" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165974 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67f3431-0e44-4d3c-8aa9-0f3fb176387d" containerName="barbican-worker-log" Mar 18 18:28:00 crc 
kubenswrapper[5008]: I0318 18:28:00.165984 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8724770c-4223-4cfe-b35b-be7cd1a6a9ff" containerName="galera" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.165998 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" containerName="placement-api" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166006 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="96efea0e-17ae-49c4-8f5c-b7341def6878" containerName="nova-metadata-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166015 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="be2e8e14-ab44-4c2b-98b1-e17ea1e22ce9" containerName="cinder-scheduler" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166024 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f0e04a-507a-42ba-97ff-d91aa199b3db" containerName="ceilometer-notification-agent" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166034 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f873fe5-8163-4b6d-8e6d-3a60914c1a3b" containerName="placement-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166048 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-expirer" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166058 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-server" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166067 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1c2fc8-83c6-4183-ac62-f23ad5db8610" containerName="openstack-network-exporter" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166076 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" 
containerName="object-replicator" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166087 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b8cadfb-82b5-4427-966d-c3e5bf2a85ad" containerName="barbican-api-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166098 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b60d757b-db66-46c1-ad92-4a9e591217a0" containerName="rabbitmq" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166109 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-auditor" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166117 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9299b1-8e15-4e9c-bada-ce88af9c1c28" containerName="neutron-api" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166127 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-server" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166138 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-replicator" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166149 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="object-updater" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166162 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f55031bd-9626-475f-a74f-d0e5f8ec8a66" containerName="ovs-vswitchd" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166173 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9299b1-8e15-4e9c-bada-ce88af9c1c28" containerName="neutron-httpd" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166184 5008 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d27fb392-40df-45a9-aeae-20781d90f02b" containerName="cinder-api-log" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166194 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-updater" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166205 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="d27fb392-40df-45a9-aeae-20781d90f02b" containerName="cinder-api" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166217 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8679cebf-8eea-45ae-be70-26eea9396f8e" containerName="glance-httpd" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166229 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5f0191-2702-46ed-ab82-e8c93ec1cf02" containerName="rabbitmq" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166242 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="account-reaper" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166252 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb3859a-2fc0-4479-a59d-7888246899a9" containerName="container-auditor" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.166858 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564308-vdh85" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.170452 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564308-vdh85"] Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.173799 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.174504 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.174708 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.268119 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnzz4\" (UniqueName: \"kubernetes.io/projected/cc5fc756-c281-47a2-bc46-1fbb55f5677c-kube-api-access-jnzz4\") pod \"auto-csr-approver-29564308-vdh85\" (UID: \"cc5fc756-c281-47a2-bc46-1fbb55f5677c\") " pod="openshift-infra/auto-csr-approver-29564308-vdh85" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.369930 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnzz4\" (UniqueName: \"kubernetes.io/projected/cc5fc756-c281-47a2-bc46-1fbb55f5677c-kube-api-access-jnzz4\") pod \"auto-csr-approver-29564308-vdh85\" (UID: \"cc5fc756-c281-47a2-bc46-1fbb55f5677c\") " pod="openshift-infra/auto-csr-approver-29564308-vdh85" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.396735 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnzz4\" (UniqueName: \"kubernetes.io/projected/cc5fc756-c281-47a2-bc46-1fbb55f5677c-kube-api-access-jnzz4\") pod \"auto-csr-approver-29564308-vdh85\" (UID: \"cc5fc756-c281-47a2-bc46-1fbb55f5677c\") " 
pod="openshift-infra/auto-csr-approver-29564308-vdh85" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.518036 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564308-vdh85" Mar 18 18:28:00 crc kubenswrapper[5008]: E0318 18:28:00.886918 5008 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefb8b8df_bedb_4e35_b709_25e83be00470.slice/crio-127d2c079fbd21e4853737582a34391032b25f34b9809690c073c3a085dcc538\": RecentStats: unable to find data in memory cache]" Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.914988 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564308-vdh85"] Mar 18 18:28:00 crc kubenswrapper[5008]: I0318 18:28:00.927433 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:28:01 crc kubenswrapper[5008]: I0318 18:28:01.750942 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564308-vdh85" event={"ID":"cc5fc756-c281-47a2-bc46-1fbb55f5677c","Type":"ContainerStarted","Data":"54ee77b5d8eb34db95bb82408889a7d9f03f61eb5ed03fbb5c53cdf86b2793e5"} Mar 18 18:28:02 crc kubenswrapper[5008]: I0318 18:28:02.764694 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564308-vdh85" event={"ID":"cc5fc756-c281-47a2-bc46-1fbb55f5677c","Type":"ContainerStarted","Data":"2d17ffe491997283a42141fddc12fbcd7818d363abf1384a438224fc59e31093"} Mar 18 18:28:03 crc kubenswrapper[5008]: I0318 18:28:03.817462 5008 generic.go:334] "Generic (PLEG): container finished" podID="cc5fc756-c281-47a2-bc46-1fbb55f5677c" containerID="2d17ffe491997283a42141fddc12fbcd7818d363abf1384a438224fc59e31093" exitCode=0 Mar 18 18:28:03 crc kubenswrapper[5008]: I0318 18:28:03.817811 5008 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564308-vdh85" event={"ID":"cc5fc756-c281-47a2-bc46-1fbb55f5677c","Type":"ContainerDied","Data":"2d17ffe491997283a42141fddc12fbcd7818d363abf1384a438224fc59e31093"} Mar 18 18:28:04 crc kubenswrapper[5008]: I0318 18:28:04.124314 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564308-vdh85" Mar 18 18:28:04 crc kubenswrapper[5008]: I0318 18:28:04.251726 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnzz4\" (UniqueName: \"kubernetes.io/projected/cc5fc756-c281-47a2-bc46-1fbb55f5677c-kube-api-access-jnzz4\") pod \"cc5fc756-c281-47a2-bc46-1fbb55f5677c\" (UID: \"cc5fc756-c281-47a2-bc46-1fbb55f5677c\") " Mar 18 18:28:04 crc kubenswrapper[5008]: I0318 18:28:04.259526 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc5fc756-c281-47a2-bc46-1fbb55f5677c-kube-api-access-jnzz4" (OuterVolumeSpecName: "kube-api-access-jnzz4") pod "cc5fc756-c281-47a2-bc46-1fbb55f5677c" (UID: "cc5fc756-c281-47a2-bc46-1fbb55f5677c"). InnerVolumeSpecName "kube-api-access-jnzz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:28:04 crc kubenswrapper[5008]: I0318 18:28:04.353645 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnzz4\" (UniqueName: \"kubernetes.io/projected/cc5fc756-c281-47a2-bc46-1fbb55f5677c-kube-api-access-jnzz4\") on node \"crc\" DevicePath \"\"" Mar 18 18:28:04 crc kubenswrapper[5008]: I0318 18:28:04.828377 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564308-vdh85" event={"ID":"cc5fc756-c281-47a2-bc46-1fbb55f5677c","Type":"ContainerDied","Data":"54ee77b5d8eb34db95bb82408889a7d9f03f61eb5ed03fbb5c53cdf86b2793e5"} Mar 18 18:28:04 crc kubenswrapper[5008]: I0318 18:28:04.828733 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54ee77b5d8eb34db95bb82408889a7d9f03f61eb5ed03fbb5c53cdf86b2793e5" Mar 18 18:28:04 crc kubenswrapper[5008]: I0318 18:28:04.828489 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564308-vdh85" Mar 18 18:28:05 crc kubenswrapper[5008]: I0318 18:28:05.201902 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564302-ht8rz"] Mar 18 18:28:05 crc kubenswrapper[5008]: I0318 18:28:05.206789 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564302-ht8rz"] Mar 18 18:28:06 crc kubenswrapper[5008]: I0318 18:28:06.209468 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236a7691-d006-4de5-bd0e-36d58d0e02b4" path="/var/lib/kubelet/pods/236a7691-d006-4de5-bd0e-36d58d0e02b4/volumes" Mar 18 18:28:11 crc kubenswrapper[5008]: E0318 18:28:11.106463 5008 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefb8b8df_bedb_4e35_b709_25e83be00470.slice/crio-127d2c079fbd21e4853737582a34391032b25f34b9809690c073c3a085dcc538\": RecentStats: unable to find data in memory cache]" Mar 18 18:28:21 crc kubenswrapper[5008]: E0318 18:28:21.313082 5008 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefb8b8df_bedb_4e35_b709_25e83be00470.slice/crio-127d2c079fbd21e4853737582a34391032b25f34b9809690c073c3a085dcc538\": RecentStats: unable to find data in memory cache]" Mar 18 18:28:24 crc kubenswrapper[5008]: I0318 18:28:24.460347 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:28:24 crc kubenswrapper[5008]: I0318 18:28:24.460936 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:28:24 crc kubenswrapper[5008]: I0318 18:28:24.461036 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:28:24 crc kubenswrapper[5008]: I0318 18:28:24.462189 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Mar 18 18:28:24 crc kubenswrapper[5008]: I0318 18:28:24.462390 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" gracePeriod=600 Mar 18 18:28:24 crc kubenswrapper[5008]: E0318 18:28:24.585432 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:28:25 crc kubenswrapper[5008]: I0318 18:28:25.012867 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" exitCode=0 Mar 18 18:28:25 crc kubenswrapper[5008]: I0318 18:28:25.012933 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241"} Mar 18 18:28:25 crc kubenswrapper[5008]: I0318 18:28:25.012986 5008 scope.go:117] "RemoveContainer" containerID="b61732c3f8965875c6dd9c25b3aae8cc8d81ecff790f1af827da9944801bd467" Mar 18 18:28:25 crc kubenswrapper[5008]: I0318 18:28:25.014864 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:28:25 crc kubenswrapper[5008]: E0318 18:28:25.015678 5008 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:28:38 crc kubenswrapper[5008]: I0318 18:28:38.287339 5008 scope.go:117] "RemoveContainer" containerID="798be6ece7d6cf52f11cd5de0dcd46ad44365eb13feba7934e2b9d594db9fe52" Mar 18 18:28:38 crc kubenswrapper[5008]: I0318 18:28:38.311787 5008 scope.go:117] "RemoveContainer" containerID="b19a7df1f361dc1f5f1167306b019694b44a326d6b04be0b9566e13afd0d3b22" Mar 18 18:28:38 crc kubenswrapper[5008]: I0318 18:28:38.335895 5008 scope.go:117] "RemoveContainer" containerID="653fc6fe31e948d5c8417ebf79632448d2a73049f8cf689fc312108d8064ed57" Mar 18 18:28:38 crc kubenswrapper[5008]: I0318 18:28:38.379687 5008 scope.go:117] "RemoveContainer" containerID="321d03f647f7f12ac4e9aa811c21fba6a9c727553aff1c409ce57b0b62f6e45a" Mar 18 18:28:38 crc kubenswrapper[5008]: I0318 18:28:38.409136 5008 scope.go:117] "RemoveContainer" containerID="5bdab6b60b219776cab7681033a182c7213dc8d2bd5bf154ccb49793888a5a7a" Mar 18 18:28:38 crc kubenswrapper[5008]: I0318 18:28:38.442932 5008 scope.go:117] "RemoveContainer" containerID="aa4c03c5ba72d29132d0cc211fd855cde6fba56b8c3be090f1c3efa9bd536ac1" Mar 18 18:28:38 crc kubenswrapper[5008]: I0318 18:28:38.472259 5008 scope.go:117] "RemoveContainer" containerID="43680a431967dbc17226617faa158bead9bd647ac537a3bcf95125106214b4b9" Mar 18 18:28:38 crc kubenswrapper[5008]: I0318 18:28:38.495352 5008 scope.go:117] "RemoveContainer" containerID="be610e0953ab45ee488b558d8bb0e6d94fda507475baed0a207f7d1beaa99d7f" Mar 18 18:28:38 crc kubenswrapper[5008]: I0318 18:28:38.510169 5008 scope.go:117] "RemoveContainer" containerID="a3defe79165feeaba9c65d446da97fc8a83798ed98fa4748fbf2caaeb30f67d5" Mar 18 
18:28:40 crc kubenswrapper[5008]: I0318 18:28:40.199242 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:28:40 crc kubenswrapper[5008]: E0318 18:28:40.199765 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.477867 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-24cbz"] Mar 18 18:28:47 crc kubenswrapper[5008]: E0318 18:28:47.478785 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc5fc756-c281-47a2-bc46-1fbb55f5677c" containerName="oc" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.478802 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5fc756-c281-47a2-bc46-1fbb55f5677c" containerName="oc" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.479018 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc5fc756-c281-47a2-bc46-1fbb55f5677c" containerName="oc" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.480155 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.499272 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-24cbz"] Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.549800 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4276ae23-d1c9-4976-a976-5ca14049dbaf-catalog-content\") pod \"certified-operators-24cbz\" (UID: \"4276ae23-d1c9-4976-a976-5ca14049dbaf\") " pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.549891 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgtg8\" (UniqueName: \"kubernetes.io/projected/4276ae23-d1c9-4976-a976-5ca14049dbaf-kube-api-access-sgtg8\") pod \"certified-operators-24cbz\" (UID: \"4276ae23-d1c9-4976-a976-5ca14049dbaf\") " pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.549946 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4276ae23-d1c9-4976-a976-5ca14049dbaf-utilities\") pod \"certified-operators-24cbz\" (UID: \"4276ae23-d1c9-4976-a976-5ca14049dbaf\") " pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.651185 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgtg8\" (UniqueName: \"kubernetes.io/projected/4276ae23-d1c9-4976-a976-5ca14049dbaf-kube-api-access-sgtg8\") pod \"certified-operators-24cbz\" (UID: \"4276ae23-d1c9-4976-a976-5ca14049dbaf\") " pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.651260 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4276ae23-d1c9-4976-a976-5ca14049dbaf-utilities\") pod \"certified-operators-24cbz\" (UID: \"4276ae23-d1c9-4976-a976-5ca14049dbaf\") " pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.651307 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4276ae23-d1c9-4976-a976-5ca14049dbaf-catalog-content\") pod \"certified-operators-24cbz\" (UID: \"4276ae23-d1c9-4976-a976-5ca14049dbaf\") " pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.651872 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4276ae23-d1c9-4976-a976-5ca14049dbaf-utilities\") pod \"certified-operators-24cbz\" (UID: \"4276ae23-d1c9-4976-a976-5ca14049dbaf\") " pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.651896 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4276ae23-d1c9-4976-a976-5ca14049dbaf-catalog-content\") pod \"certified-operators-24cbz\" (UID: \"4276ae23-d1c9-4976-a976-5ca14049dbaf\") " pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.672872 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgtg8\" (UniqueName: \"kubernetes.io/projected/4276ae23-d1c9-4976-a976-5ca14049dbaf-kube-api-access-sgtg8\") pod \"certified-operators-24cbz\" (UID: \"4276ae23-d1c9-4976-a976-5ca14049dbaf\") " pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:47 crc kubenswrapper[5008]: I0318 18:28:47.795577 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:48 crc kubenswrapper[5008]: I0318 18:28:48.290825 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-24cbz"] Mar 18 18:28:49 crc kubenswrapper[5008]: I0318 18:28:49.233059 5008 generic.go:334] "Generic (PLEG): container finished" podID="4276ae23-d1c9-4976-a976-5ca14049dbaf" containerID="784e5b88f0a959b94dcbf9a10da49b82df1c924969ab54f1a561ded8b4d3a899" exitCode=0 Mar 18 18:28:49 crc kubenswrapper[5008]: I0318 18:28:49.233147 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24cbz" event={"ID":"4276ae23-d1c9-4976-a976-5ca14049dbaf","Type":"ContainerDied","Data":"784e5b88f0a959b94dcbf9a10da49b82df1c924969ab54f1a561ded8b4d3a899"} Mar 18 18:28:49 crc kubenswrapper[5008]: I0318 18:28:49.233492 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24cbz" event={"ID":"4276ae23-d1c9-4976-a976-5ca14049dbaf","Type":"ContainerStarted","Data":"c441ff0f24c0e5694dec757ab884bffa73f80b34306311aea933777e0287fe19"} Mar 18 18:28:51 crc kubenswrapper[5008]: I0318 18:28:51.255025 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24cbz" event={"ID":"4276ae23-d1c9-4976-a976-5ca14049dbaf","Type":"ContainerStarted","Data":"5168ce24ef45bda8d27b93fc95762aca3eb3a1336583725f5bc5fdba43266184"} Mar 18 18:28:52 crc kubenswrapper[5008]: I0318 18:28:52.198924 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:28:52 crc kubenswrapper[5008]: E0318 18:28:52.199424 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:28:52 crc kubenswrapper[5008]: I0318 18:28:52.278061 5008 generic.go:334] "Generic (PLEG): container finished" podID="4276ae23-d1c9-4976-a976-5ca14049dbaf" containerID="5168ce24ef45bda8d27b93fc95762aca3eb3a1336583725f5bc5fdba43266184" exitCode=0 Mar 18 18:28:52 crc kubenswrapper[5008]: I0318 18:28:52.278138 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24cbz" event={"ID":"4276ae23-d1c9-4976-a976-5ca14049dbaf","Type":"ContainerDied","Data":"5168ce24ef45bda8d27b93fc95762aca3eb3a1336583725f5bc5fdba43266184"} Mar 18 18:28:53 crc kubenswrapper[5008]: I0318 18:28:53.287795 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24cbz" event={"ID":"4276ae23-d1c9-4976-a976-5ca14049dbaf","Type":"ContainerStarted","Data":"ff958d04f13857a23e5a8c3c0fb40f73caae27d2969d615c47c7e3dd58120d81"} Mar 18 18:28:53 crc kubenswrapper[5008]: I0318 18:28:53.313169 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-24cbz" podStartSLOduration=2.7772268110000002 podStartE2EDuration="6.313149791s" podCreationTimestamp="2026-03-18 18:28:47 +0000 UTC" firstStartedPulling="2026-03-18 18:28:49.234928982 +0000 UTC m=+1585.754402071" lastFinishedPulling="2026-03-18 18:28:52.770851882 +0000 UTC m=+1589.290325051" observedRunningTime="2026-03-18 18:28:53.309889235 +0000 UTC m=+1589.829362334" watchObservedRunningTime="2026-03-18 18:28:53.313149791 +0000 UTC m=+1589.832622860" Mar 18 18:28:57 crc kubenswrapper[5008]: I0318 18:28:57.796778 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:57 crc kubenswrapper[5008]: 
I0318 18:28:57.797222 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:57 crc kubenswrapper[5008]: I0318 18:28:57.848455 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:58 crc kubenswrapper[5008]: I0318 18:28:58.386541 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:28:58 crc kubenswrapper[5008]: I0318 18:28:58.441013 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-24cbz"] Mar 18 18:29:00 crc kubenswrapper[5008]: I0318 18:29:00.341991 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-24cbz" podUID="4276ae23-d1c9-4976-a976-5ca14049dbaf" containerName="registry-server" containerID="cri-o://ff958d04f13857a23e5a8c3c0fb40f73caae27d2969d615c47c7e3dd58120d81" gracePeriod=2 Mar 18 18:29:00 crc kubenswrapper[5008]: I0318 18:29:00.764871 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:29:00 crc kubenswrapper[5008]: I0318 18:29:00.853308 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4276ae23-d1c9-4976-a976-5ca14049dbaf-utilities\") pod \"4276ae23-d1c9-4976-a976-5ca14049dbaf\" (UID: \"4276ae23-d1c9-4976-a976-5ca14049dbaf\") " Mar 18 18:29:00 crc kubenswrapper[5008]: I0318 18:29:00.853393 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgtg8\" (UniqueName: \"kubernetes.io/projected/4276ae23-d1c9-4976-a976-5ca14049dbaf-kube-api-access-sgtg8\") pod \"4276ae23-d1c9-4976-a976-5ca14049dbaf\" (UID: \"4276ae23-d1c9-4976-a976-5ca14049dbaf\") " Mar 18 18:29:00 crc kubenswrapper[5008]: I0318 18:29:00.853420 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4276ae23-d1c9-4976-a976-5ca14049dbaf-catalog-content\") pod \"4276ae23-d1c9-4976-a976-5ca14049dbaf\" (UID: \"4276ae23-d1c9-4976-a976-5ca14049dbaf\") " Mar 18 18:29:00 crc kubenswrapper[5008]: I0318 18:29:00.854695 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4276ae23-d1c9-4976-a976-5ca14049dbaf-utilities" (OuterVolumeSpecName: "utilities") pod "4276ae23-d1c9-4976-a976-5ca14049dbaf" (UID: "4276ae23-d1c9-4976-a976-5ca14049dbaf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:29:00 crc kubenswrapper[5008]: I0318 18:29:00.864802 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4276ae23-d1c9-4976-a976-5ca14049dbaf-kube-api-access-sgtg8" (OuterVolumeSpecName: "kube-api-access-sgtg8") pod "4276ae23-d1c9-4976-a976-5ca14049dbaf" (UID: "4276ae23-d1c9-4976-a976-5ca14049dbaf"). InnerVolumeSpecName "kube-api-access-sgtg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:29:00 crc kubenswrapper[5008]: I0318 18:29:00.920308 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4276ae23-d1c9-4976-a976-5ca14049dbaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4276ae23-d1c9-4976-a976-5ca14049dbaf" (UID: "4276ae23-d1c9-4976-a976-5ca14049dbaf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:29:00 crc kubenswrapper[5008]: I0318 18:29:00.955848 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4276ae23-d1c9-4976-a976-5ca14049dbaf-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:29:00 crc kubenswrapper[5008]: I0318 18:29:00.955927 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgtg8\" (UniqueName: \"kubernetes.io/projected/4276ae23-d1c9-4976-a976-5ca14049dbaf-kube-api-access-sgtg8\") on node \"crc\" DevicePath \"\"" Mar 18 18:29:00 crc kubenswrapper[5008]: I0318 18:29:00.955953 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4276ae23-d1c9-4976-a976-5ca14049dbaf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.357621 5008 generic.go:334] "Generic (PLEG): container finished" podID="4276ae23-d1c9-4976-a976-5ca14049dbaf" containerID="ff958d04f13857a23e5a8c3c0fb40f73caae27d2969d615c47c7e3dd58120d81" exitCode=0 Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.357703 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-24cbz" Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.357729 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24cbz" event={"ID":"4276ae23-d1c9-4976-a976-5ca14049dbaf","Type":"ContainerDied","Data":"ff958d04f13857a23e5a8c3c0fb40f73caae27d2969d615c47c7e3dd58120d81"} Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.358120 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-24cbz" event={"ID":"4276ae23-d1c9-4976-a976-5ca14049dbaf","Type":"ContainerDied","Data":"c441ff0f24c0e5694dec757ab884bffa73f80b34306311aea933777e0287fe19"} Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.358202 5008 scope.go:117] "RemoveContainer" containerID="ff958d04f13857a23e5a8c3c0fb40f73caae27d2969d615c47c7e3dd58120d81" Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.398300 5008 scope.go:117] "RemoveContainer" containerID="5168ce24ef45bda8d27b93fc95762aca3eb3a1336583725f5bc5fdba43266184" Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.399170 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-24cbz"] Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.405272 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-24cbz"] Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.422153 5008 scope.go:117] "RemoveContainer" containerID="784e5b88f0a959b94dcbf9a10da49b82df1c924969ab54f1a561ded8b4d3a899" Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.456802 5008 scope.go:117] "RemoveContainer" containerID="ff958d04f13857a23e5a8c3c0fb40f73caae27d2969d615c47c7e3dd58120d81" Mar 18 18:29:01 crc kubenswrapper[5008]: E0318 18:29:01.457267 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ff958d04f13857a23e5a8c3c0fb40f73caae27d2969d615c47c7e3dd58120d81\": container with ID starting with ff958d04f13857a23e5a8c3c0fb40f73caae27d2969d615c47c7e3dd58120d81 not found: ID does not exist" containerID="ff958d04f13857a23e5a8c3c0fb40f73caae27d2969d615c47c7e3dd58120d81" Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.457296 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff958d04f13857a23e5a8c3c0fb40f73caae27d2969d615c47c7e3dd58120d81"} err="failed to get container status \"ff958d04f13857a23e5a8c3c0fb40f73caae27d2969d615c47c7e3dd58120d81\": rpc error: code = NotFound desc = could not find container \"ff958d04f13857a23e5a8c3c0fb40f73caae27d2969d615c47c7e3dd58120d81\": container with ID starting with ff958d04f13857a23e5a8c3c0fb40f73caae27d2969d615c47c7e3dd58120d81 not found: ID does not exist" Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.457315 5008 scope.go:117] "RemoveContainer" containerID="5168ce24ef45bda8d27b93fc95762aca3eb3a1336583725f5bc5fdba43266184" Mar 18 18:29:01 crc kubenswrapper[5008]: E0318 18:29:01.457857 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5168ce24ef45bda8d27b93fc95762aca3eb3a1336583725f5bc5fdba43266184\": container with ID starting with 5168ce24ef45bda8d27b93fc95762aca3eb3a1336583725f5bc5fdba43266184 not found: ID does not exist" containerID="5168ce24ef45bda8d27b93fc95762aca3eb3a1336583725f5bc5fdba43266184" Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.457878 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5168ce24ef45bda8d27b93fc95762aca3eb3a1336583725f5bc5fdba43266184"} err="failed to get container status \"5168ce24ef45bda8d27b93fc95762aca3eb3a1336583725f5bc5fdba43266184\": rpc error: code = NotFound desc = could not find container \"5168ce24ef45bda8d27b93fc95762aca3eb3a1336583725f5bc5fdba43266184\": container with ID 
starting with 5168ce24ef45bda8d27b93fc95762aca3eb3a1336583725f5bc5fdba43266184 not found: ID does not exist" Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.457893 5008 scope.go:117] "RemoveContainer" containerID="784e5b88f0a959b94dcbf9a10da49b82df1c924969ab54f1a561ded8b4d3a899" Mar 18 18:29:01 crc kubenswrapper[5008]: E0318 18:29:01.458319 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"784e5b88f0a959b94dcbf9a10da49b82df1c924969ab54f1a561ded8b4d3a899\": container with ID starting with 784e5b88f0a959b94dcbf9a10da49b82df1c924969ab54f1a561ded8b4d3a899 not found: ID does not exist" containerID="784e5b88f0a959b94dcbf9a10da49b82df1c924969ab54f1a561ded8b4d3a899" Mar 18 18:29:01 crc kubenswrapper[5008]: I0318 18:29:01.458337 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"784e5b88f0a959b94dcbf9a10da49b82df1c924969ab54f1a561ded8b4d3a899"} err="failed to get container status \"784e5b88f0a959b94dcbf9a10da49b82df1c924969ab54f1a561ded8b4d3a899\": rpc error: code = NotFound desc = could not find container \"784e5b88f0a959b94dcbf9a10da49b82df1c924969ab54f1a561ded8b4d3a899\": container with ID starting with 784e5b88f0a959b94dcbf9a10da49b82df1c924969ab54f1a561ded8b4d3a899 not found: ID does not exist" Mar 18 18:29:02 crc kubenswrapper[5008]: I0318 18:29:02.218006 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4276ae23-d1c9-4976-a976-5ca14049dbaf" path="/var/lib/kubelet/pods/4276ae23-d1c9-4976-a976-5ca14049dbaf/volumes" Mar 18 18:29:04 crc kubenswrapper[5008]: I0318 18:29:04.204645 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:29:04 crc kubenswrapper[5008]: E0318 18:29:04.205179 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:29:16 crc kubenswrapper[5008]: I0318 18:29:16.198407 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:29:16 crc kubenswrapper[5008]: E0318 18:29:16.199296 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:29:28 crc kubenswrapper[5008]: I0318 18:29:28.198398 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:29:28 crc kubenswrapper[5008]: E0318 18:29:28.199184 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:29:38 crc kubenswrapper[5008]: I0318 18:29:38.710402 5008 scope.go:117] "RemoveContainer" containerID="f3b95184a7984c98dbe7c421b1a76f5828bb070cd349c4211c466713abbb44bd" Mar 18 18:29:38 crc kubenswrapper[5008]: I0318 18:29:38.742914 5008 scope.go:117] "RemoveContainer" containerID="4f493c0654ac13a7e1955bddbb740f2d66692da8d367b003d1337865c6e92adf" Mar 18 18:29:38 crc 
kubenswrapper[5008]: I0318 18:29:38.786717 5008 scope.go:117] "RemoveContainer" containerID="081f3bead1fb53eb0fef72bd4a02d8ed9f8351a7fc981808236fa647e89e8a7d" Mar 18 18:29:38 crc kubenswrapper[5008]: I0318 18:29:38.847023 5008 scope.go:117] "RemoveContainer" containerID="301939a383694f0855d653da76c5187011b8a736256c7435c14a333754982b7e" Mar 18 18:29:38 crc kubenswrapper[5008]: I0318 18:29:38.889511 5008 scope.go:117] "RemoveContainer" containerID="4047e4f55dd1036cd59241e4a458eb989331ec47896b35605a25dca184e1b9f0" Mar 18 18:29:38 crc kubenswrapper[5008]: I0318 18:29:38.927931 5008 scope.go:117] "RemoveContainer" containerID="b2bee2c950af6264c90593b94283891f75fe0f285363f33017f08c83c4819718" Mar 18 18:29:38 crc kubenswrapper[5008]: I0318 18:29:38.948295 5008 scope.go:117] "RemoveContainer" containerID="691a4cffba40335c2f67b7e5844c2b3a8e32108a6a47bfa2c66e10562fd31321" Mar 18 18:29:38 crc kubenswrapper[5008]: I0318 18:29:38.972658 5008 scope.go:117] "RemoveContainer" containerID="c5ebbf7fdaca2200cb1d9823b9896d5673972b9ae440af8084db85d94e9e1a85" Mar 18 18:29:38 crc kubenswrapper[5008]: I0318 18:29:38.989535 5008 scope.go:117] "RemoveContainer" containerID="358a921eea98df73bcc1f5aad28a595444fa783bab3b83f0782f3fbeada80880" Mar 18 18:29:39 crc kubenswrapper[5008]: I0318 18:29:39.015319 5008 scope.go:117] "RemoveContainer" containerID="583ea8f91c610b42793eda14286b85f2452817e74199e6c8c7a0515bdd26747e" Mar 18 18:29:39 crc kubenswrapper[5008]: I0318 18:29:39.061046 5008 scope.go:117] "RemoveContainer" containerID="16c0e14b3ff7895a9c2c9c4b897294a5ac7ce1bd99ff7ff4b5ed6bef900c7565" Mar 18 18:29:39 crc kubenswrapper[5008]: I0318 18:29:39.083328 5008 scope.go:117] "RemoveContainer" containerID="b170adfe134c9b486f05f7ade0111e4b3038c407224af9fab38d1f5b42aacb25" Mar 18 18:29:39 crc kubenswrapper[5008]: I0318 18:29:39.110818 5008 scope.go:117] "RemoveContainer" containerID="fd8acbbbdc07719b3fd9789e70de5b0c19e9336a0bf5e6a2e0e491ef32ac7c89" Mar 18 18:29:39 crc kubenswrapper[5008]: I0318 
18:29:39.160911 5008 scope.go:117] "RemoveContainer" containerID="38c35231cb4f965f24f273cdc08bc89c2aee65ecd48f5c1502dae6903a1aea69" Mar 18 18:29:39 crc kubenswrapper[5008]: I0318 18:29:39.179335 5008 scope.go:117] "RemoveContainer" containerID="47effa833330afdff41754f56f12e67efb2dfaec92e6efa219903a6b7c1fe169" Mar 18 18:29:39 crc kubenswrapper[5008]: I0318 18:29:39.199073 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:29:39 crc kubenswrapper[5008]: E0318 18:29:39.199391 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:29:47 crc kubenswrapper[5008]: I0318 18:29:47.744708 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hkgq8"] Mar 18 18:29:47 crc kubenswrapper[5008]: E0318 18:29:47.745674 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4276ae23-d1c9-4976-a976-5ca14049dbaf" containerName="registry-server" Mar 18 18:29:47 crc kubenswrapper[5008]: I0318 18:29:47.745707 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4276ae23-d1c9-4976-a976-5ca14049dbaf" containerName="registry-server" Mar 18 18:29:47 crc kubenswrapper[5008]: E0318 18:29:47.745740 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4276ae23-d1c9-4976-a976-5ca14049dbaf" containerName="extract-utilities" Mar 18 18:29:47 crc kubenswrapper[5008]: I0318 18:29:47.745752 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4276ae23-d1c9-4976-a976-5ca14049dbaf" containerName="extract-utilities" Mar 18 18:29:47 crc kubenswrapper[5008]: E0318 
18:29:47.745786 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4276ae23-d1c9-4976-a976-5ca14049dbaf" containerName="extract-content" Mar 18 18:29:47 crc kubenswrapper[5008]: I0318 18:29:47.745796 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4276ae23-d1c9-4976-a976-5ca14049dbaf" containerName="extract-content" Mar 18 18:29:47 crc kubenswrapper[5008]: I0318 18:29:47.746026 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4276ae23-d1c9-4976-a976-5ca14049dbaf" containerName="registry-server" Mar 18 18:29:47 crc kubenswrapper[5008]: I0318 18:29:47.747508 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:47 crc kubenswrapper[5008]: I0318 18:29:47.759869 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkgq8"] Mar 18 18:29:47 crc kubenswrapper[5008]: I0318 18:29:47.904515 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fb89930-99dd-4d82-9982-adf3b3aa39cf-utilities\") pod \"redhat-marketplace-hkgq8\" (UID: \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\") " pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:47 crc kubenswrapper[5008]: I0318 18:29:47.904799 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fb89930-99dd-4d82-9982-adf3b3aa39cf-catalog-content\") pod \"redhat-marketplace-hkgq8\" (UID: \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\") " pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:47 crc kubenswrapper[5008]: I0318 18:29:47.905065 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfm7j\" (UniqueName: 
\"kubernetes.io/projected/6fb89930-99dd-4d82-9982-adf3b3aa39cf-kube-api-access-hfm7j\") pod \"redhat-marketplace-hkgq8\" (UID: \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\") " pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:48 crc kubenswrapper[5008]: I0318 18:29:48.006397 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfm7j\" (UniqueName: \"kubernetes.io/projected/6fb89930-99dd-4d82-9982-adf3b3aa39cf-kube-api-access-hfm7j\") pod \"redhat-marketplace-hkgq8\" (UID: \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\") " pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:48 crc kubenswrapper[5008]: I0318 18:29:48.006524 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fb89930-99dd-4d82-9982-adf3b3aa39cf-utilities\") pod \"redhat-marketplace-hkgq8\" (UID: \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\") " pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:48 crc kubenswrapper[5008]: I0318 18:29:48.006604 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fb89930-99dd-4d82-9982-adf3b3aa39cf-catalog-content\") pod \"redhat-marketplace-hkgq8\" (UID: \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\") " pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:48 crc kubenswrapper[5008]: I0318 18:29:48.006974 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fb89930-99dd-4d82-9982-adf3b3aa39cf-utilities\") pod \"redhat-marketplace-hkgq8\" (UID: \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\") " pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:48 crc kubenswrapper[5008]: I0318 18:29:48.007058 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6fb89930-99dd-4d82-9982-adf3b3aa39cf-catalog-content\") pod \"redhat-marketplace-hkgq8\" (UID: \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\") " pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:48 crc kubenswrapper[5008]: I0318 18:29:48.034374 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfm7j\" (UniqueName: \"kubernetes.io/projected/6fb89930-99dd-4d82-9982-adf3b3aa39cf-kube-api-access-hfm7j\") pod \"redhat-marketplace-hkgq8\" (UID: \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\") " pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:48 crc kubenswrapper[5008]: I0318 18:29:48.122851 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:48 crc kubenswrapper[5008]: I0318 18:29:48.587793 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkgq8"] Mar 18 18:29:49 crc kubenswrapper[5008]: I0318 18:29:49.041668 5008 generic.go:334] "Generic (PLEG): container finished" podID="6fb89930-99dd-4d82-9982-adf3b3aa39cf" containerID="5c3564a81c6da0f508ae7b059d5bfbf7719f0d4cf79b4b50f137982c6722a70f" exitCode=0 Mar 18 18:29:49 crc kubenswrapper[5008]: I0318 18:29:49.041771 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkgq8" event={"ID":"6fb89930-99dd-4d82-9982-adf3b3aa39cf","Type":"ContainerDied","Data":"5c3564a81c6da0f508ae7b059d5bfbf7719f0d4cf79b4b50f137982c6722a70f"} Mar 18 18:29:49 crc kubenswrapper[5008]: I0318 18:29:49.042105 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkgq8" event={"ID":"6fb89930-99dd-4d82-9982-adf3b3aa39cf","Type":"ContainerStarted","Data":"067d5fcaecd49dfad4e2c50f1b126ffe54c5db856ca8a297e99c648b62d380c7"} Mar 18 18:29:51 crc kubenswrapper[5008]: I0318 18:29:51.066392 5008 generic.go:334] "Generic (PLEG): container 
finished" podID="6fb89930-99dd-4d82-9982-adf3b3aa39cf" containerID="5b29bbed5440a30fb27eae418f1dfee4ad7aa5eaa615bcf14d76437c18c3bcec" exitCode=0 Mar 18 18:29:51 crc kubenswrapper[5008]: I0318 18:29:51.066510 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkgq8" event={"ID":"6fb89930-99dd-4d82-9982-adf3b3aa39cf","Type":"ContainerDied","Data":"5b29bbed5440a30fb27eae418f1dfee4ad7aa5eaa615bcf14d76437c18c3bcec"} Mar 18 18:29:51 crc kubenswrapper[5008]: I0318 18:29:51.199004 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:29:51 crc kubenswrapper[5008]: E0318 18:29:51.199432 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:29:52 crc kubenswrapper[5008]: I0318 18:29:52.078440 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkgq8" event={"ID":"6fb89930-99dd-4d82-9982-adf3b3aa39cf","Type":"ContainerStarted","Data":"9a9b2c82b07dc78c4ca3f3dfa777713cdb7013952332cc8b3efa8798092833df"} Mar 18 18:29:52 crc kubenswrapper[5008]: I0318 18:29:52.116467 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hkgq8" podStartSLOduration=2.613500541 podStartE2EDuration="5.116447059s" podCreationTimestamp="2026-03-18 18:29:47 +0000 UTC" firstStartedPulling="2026-03-18 18:29:49.043940398 +0000 UTC m=+1645.563413477" lastFinishedPulling="2026-03-18 18:29:51.546886916 +0000 UTC m=+1648.066359995" observedRunningTime="2026-03-18 18:29:52.111096677 +0000 UTC 
m=+1648.630569756" watchObservedRunningTime="2026-03-18 18:29:52.116447059 +0000 UTC m=+1648.635920148" Mar 18 18:29:58 crc kubenswrapper[5008]: I0318 18:29:58.123937 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:58 crc kubenswrapper[5008]: I0318 18:29:58.125023 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:58 crc kubenswrapper[5008]: I0318 18:29:58.169341 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:58 crc kubenswrapper[5008]: I0318 18:29:58.227888 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:29:58 crc kubenswrapper[5008]: I0318 18:29:58.416350 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkgq8"] Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.160228 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564310-gkp7r"] Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.161701 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564310-gkp7r" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.162176 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hkgq8" podUID="6fb89930-99dd-4d82-9982-adf3b3aa39cf" containerName="registry-server" containerID="cri-o://9a9b2c82b07dc78c4ca3f3dfa777713cdb7013952332cc8b3efa8798092833df" gracePeriod=2 Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.166222 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.166300 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.169497 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.187625 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564310-gkp7r"] Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.259447 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5"] Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.260264 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.265415 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.266031 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.272352 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5"] Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.306903 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd658\" (UniqueName: \"kubernetes.io/projected/e689aa22-89d5-442d-8473-6bf84b521b85-kube-api-access-jd658\") pod \"auto-csr-approver-29564310-gkp7r\" (UID: \"e689aa22-89d5-442d-8473-6bf84b521b85\") " pod="openshift-infra/auto-csr-approver-29564310-gkp7r" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.408141 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-config-volume\") pod \"collect-profiles-29564310-fg5t5\" (UID: \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.408232 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd658\" (UniqueName: \"kubernetes.io/projected/e689aa22-89d5-442d-8473-6bf84b521b85-kube-api-access-jd658\") pod \"auto-csr-approver-29564310-gkp7r\" (UID: \"e689aa22-89d5-442d-8473-6bf84b521b85\") " pod="openshift-infra/auto-csr-approver-29564310-gkp7r" Mar 18 
18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.408262 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdbtx\" (UniqueName: \"kubernetes.io/projected/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-kube-api-access-pdbtx\") pod \"collect-profiles-29564310-fg5t5\" (UID: \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.408316 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-secret-volume\") pod \"collect-profiles-29564310-fg5t5\" (UID: \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.431955 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd658\" (UniqueName: \"kubernetes.io/projected/e689aa22-89d5-442d-8473-6bf84b521b85-kube-api-access-jd658\") pod \"auto-csr-approver-29564310-gkp7r\" (UID: \"e689aa22-89d5-442d-8473-6bf84b521b85\") " pod="openshift-infra/auto-csr-approver-29564310-gkp7r" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.489464 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564310-gkp7r" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.512574 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-secret-volume\") pod \"collect-profiles-29564310-fg5t5\" (UID: \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.512634 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-config-volume\") pod \"collect-profiles-29564310-fg5t5\" (UID: \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.512688 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdbtx\" (UniqueName: \"kubernetes.io/projected/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-kube-api-access-pdbtx\") pod \"collect-profiles-29564310-fg5t5\" (UID: \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.515300 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-config-volume\") pod \"collect-profiles-29564310-fg5t5\" (UID: \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.518057 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-secret-volume\") pod \"collect-profiles-29564310-fg5t5\" (UID: \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.530097 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdbtx\" (UniqueName: \"kubernetes.io/projected/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-kube-api-access-pdbtx\") pod \"collect-profiles-29564310-fg5t5\" (UID: \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.599332 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.616395 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.715334 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fb89930-99dd-4d82-9982-adf3b3aa39cf-utilities\") pod \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\" (UID: \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\") " Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.715430 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fb89930-99dd-4d82-9982-adf3b3aa39cf-catalog-content\") pod \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\" (UID: \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\") " Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.715483 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfm7j\" (UniqueName: 
\"kubernetes.io/projected/6fb89930-99dd-4d82-9982-adf3b3aa39cf-kube-api-access-hfm7j\") pod \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\" (UID: \"6fb89930-99dd-4d82-9982-adf3b3aa39cf\") " Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.716308 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fb89930-99dd-4d82-9982-adf3b3aa39cf-utilities" (OuterVolumeSpecName: "utilities") pod "6fb89930-99dd-4d82-9982-adf3b3aa39cf" (UID: "6fb89930-99dd-4d82-9982-adf3b3aa39cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.719682 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fb89930-99dd-4d82-9982-adf3b3aa39cf-kube-api-access-hfm7j" (OuterVolumeSpecName: "kube-api-access-hfm7j") pod "6fb89930-99dd-4d82-9982-adf3b3aa39cf" (UID: "6fb89930-99dd-4d82-9982-adf3b3aa39cf"). InnerVolumeSpecName "kube-api-access-hfm7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.752132 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fb89930-99dd-4d82-9982-adf3b3aa39cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fb89930-99dd-4d82-9982-adf3b3aa39cf" (UID: "6fb89930-99dd-4d82-9982-adf3b3aa39cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.816911 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fb89930-99dd-4d82-9982-adf3b3aa39cf-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.816951 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fb89930-99dd-4d82-9982-adf3b3aa39cf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.816970 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfm7j\" (UniqueName: \"kubernetes.io/projected/6fb89930-99dd-4d82-9982-adf3b3aa39cf-kube-api-access-hfm7j\") on node \"crc\" DevicePath \"\"" Mar 18 18:30:00 crc kubenswrapper[5008]: I0318 18:30:00.929279 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564310-gkp7r"] Mar 18 18:30:01 crc kubenswrapper[5008]: W0318 18:30:01.039656 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cf23e92_a320_42c0_a05e_fe7aa2ce261e.slice/crio-426987de9f5b85068de78f0e3b3406b348f813f312553df5ffc1d6f9206cfac3 WatchSource:0}: Error finding container 426987de9f5b85068de78f0e3b3406b348f813f312553df5ffc1d6f9206cfac3: Status 404 returned error can't find the container with id 426987de9f5b85068de78f0e3b3406b348f813f312553df5ffc1d6f9206cfac3 Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.042531 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5"] Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.171282 5008 generic.go:334] "Generic (PLEG): container finished" podID="6fb89930-99dd-4d82-9982-adf3b3aa39cf" 
containerID="9a9b2c82b07dc78c4ca3f3dfa777713cdb7013952332cc8b3efa8798092833df" exitCode=0 Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.171393 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkgq8" Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.172642 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkgq8" event={"ID":"6fb89930-99dd-4d82-9982-adf3b3aa39cf","Type":"ContainerDied","Data":"9a9b2c82b07dc78c4ca3f3dfa777713cdb7013952332cc8b3efa8798092833df"} Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.172685 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkgq8" event={"ID":"6fb89930-99dd-4d82-9982-adf3b3aa39cf","Type":"ContainerDied","Data":"067d5fcaecd49dfad4e2c50f1b126ffe54c5db856ca8a297e99c648b62d380c7"} Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.172728 5008 scope.go:117] "RemoveContainer" containerID="9a9b2c82b07dc78c4ca3f3dfa777713cdb7013952332cc8b3efa8798092833df" Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.176233 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564310-gkp7r" event={"ID":"e689aa22-89d5-442d-8473-6bf84b521b85","Type":"ContainerStarted","Data":"0bb0261cbbf7a5cbe56b0d5a8c8b48a0eab2ac68ae50fc28e7472a105450dd12"} Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.179412 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" event={"ID":"1cf23e92-a320-42c0-a05e-fe7aa2ce261e","Type":"ContainerStarted","Data":"426987de9f5b85068de78f0e3b3406b348f813f312553df5ffc1d6f9206cfac3"} Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.189717 5008 scope.go:117] "RemoveContainer" containerID="5b29bbed5440a30fb27eae418f1dfee4ad7aa5eaa615bcf14d76437c18c3bcec" Mar 18 18:30:01 crc kubenswrapper[5008]: 
I0318 18:30:01.218204 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkgq8"] Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.223868 5008 scope.go:117] "RemoveContainer" containerID="5c3564a81c6da0f508ae7b059d5bfbf7719f0d4cf79b4b50f137982c6722a70f" Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.225861 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkgq8"] Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.243907 5008 scope.go:117] "RemoveContainer" containerID="9a9b2c82b07dc78c4ca3f3dfa777713cdb7013952332cc8b3efa8798092833df" Mar 18 18:30:01 crc kubenswrapper[5008]: E0318 18:30:01.244350 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a9b2c82b07dc78c4ca3f3dfa777713cdb7013952332cc8b3efa8798092833df\": container with ID starting with 9a9b2c82b07dc78c4ca3f3dfa777713cdb7013952332cc8b3efa8798092833df not found: ID does not exist" containerID="9a9b2c82b07dc78c4ca3f3dfa777713cdb7013952332cc8b3efa8798092833df" Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.244390 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a9b2c82b07dc78c4ca3f3dfa777713cdb7013952332cc8b3efa8798092833df"} err="failed to get container status \"9a9b2c82b07dc78c4ca3f3dfa777713cdb7013952332cc8b3efa8798092833df\": rpc error: code = NotFound desc = could not find container \"9a9b2c82b07dc78c4ca3f3dfa777713cdb7013952332cc8b3efa8798092833df\": container with ID starting with 9a9b2c82b07dc78c4ca3f3dfa777713cdb7013952332cc8b3efa8798092833df not found: ID does not exist" Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.244418 5008 scope.go:117] "RemoveContainer" containerID="5b29bbed5440a30fb27eae418f1dfee4ad7aa5eaa615bcf14d76437c18c3bcec" Mar 18 18:30:01 crc kubenswrapper[5008]: E0318 18:30:01.244841 5008 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b29bbed5440a30fb27eae418f1dfee4ad7aa5eaa615bcf14d76437c18c3bcec\": container with ID starting with 5b29bbed5440a30fb27eae418f1dfee4ad7aa5eaa615bcf14d76437c18c3bcec not found: ID does not exist" containerID="5b29bbed5440a30fb27eae418f1dfee4ad7aa5eaa615bcf14d76437c18c3bcec" Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.244877 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b29bbed5440a30fb27eae418f1dfee4ad7aa5eaa615bcf14d76437c18c3bcec"} err="failed to get container status \"5b29bbed5440a30fb27eae418f1dfee4ad7aa5eaa615bcf14d76437c18c3bcec\": rpc error: code = NotFound desc = could not find container \"5b29bbed5440a30fb27eae418f1dfee4ad7aa5eaa615bcf14d76437c18c3bcec\": container with ID starting with 5b29bbed5440a30fb27eae418f1dfee4ad7aa5eaa615bcf14d76437c18c3bcec not found: ID does not exist" Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.244900 5008 scope.go:117] "RemoveContainer" containerID="5c3564a81c6da0f508ae7b059d5bfbf7719f0d4cf79b4b50f137982c6722a70f" Mar 18 18:30:01 crc kubenswrapper[5008]: E0318 18:30:01.245334 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c3564a81c6da0f508ae7b059d5bfbf7719f0d4cf79b4b50f137982c6722a70f\": container with ID starting with 5c3564a81c6da0f508ae7b059d5bfbf7719f0d4cf79b4b50f137982c6722a70f not found: ID does not exist" containerID="5c3564a81c6da0f508ae7b059d5bfbf7719f0d4cf79b4b50f137982c6722a70f" Mar 18 18:30:01 crc kubenswrapper[5008]: I0318 18:30:01.245355 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c3564a81c6da0f508ae7b059d5bfbf7719f0d4cf79b4b50f137982c6722a70f"} err="failed to get container status \"5c3564a81c6da0f508ae7b059d5bfbf7719f0d4cf79b4b50f137982c6722a70f\": rpc error: code = NotFound desc = could not find container 
\"5c3564a81c6da0f508ae7b059d5bfbf7719f0d4cf79b4b50f137982c6722a70f\": container with ID starting with 5c3564a81c6da0f508ae7b059d5bfbf7719f0d4cf79b4b50f137982c6722a70f not found: ID does not exist" Mar 18 18:30:02 crc kubenswrapper[5008]: I0318 18:30:02.190025 5008 generic.go:334] "Generic (PLEG): container finished" podID="1cf23e92-a320-42c0-a05e-fe7aa2ce261e" containerID="21dc2cf38e138ecedfee59faa41639fb8c8c183a0eb8c1ecb0685e19f7bbde17" exitCode=0 Mar 18 18:30:02 crc kubenswrapper[5008]: I0318 18:30:02.190112 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" event={"ID":"1cf23e92-a320-42c0-a05e-fe7aa2ce261e","Type":"ContainerDied","Data":"21dc2cf38e138ecedfee59faa41639fb8c8c183a0eb8c1ecb0685e19f7bbde17"} Mar 18 18:30:02 crc kubenswrapper[5008]: I0318 18:30:02.210896 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fb89930-99dd-4d82-9982-adf3b3aa39cf" path="/var/lib/kubelet/pods/6fb89930-99dd-4d82-9982-adf3b3aa39cf/volumes" Mar 18 18:30:03 crc kubenswrapper[5008]: I0318 18:30:03.201380 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564310-gkp7r" event={"ID":"e689aa22-89d5-442d-8473-6bf84b521b85","Type":"ContainerStarted","Data":"c2bf8d3619c0a40678d5c6947afe832138e74e68850571901ca3a40080bd71d2"} Mar 18 18:30:03 crc kubenswrapper[5008]: I0318 18:30:03.218851 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564310-gkp7r" podStartSLOduration=1.349400542 podStartE2EDuration="3.218834812s" podCreationTimestamp="2026-03-18 18:30:00 +0000 UTC" firstStartedPulling="2026-03-18 18:30:00.938441145 +0000 UTC m=+1657.457914224" lastFinishedPulling="2026-03-18 18:30:02.807875415 +0000 UTC m=+1659.327348494" observedRunningTime="2026-03-18 18:30:03.212171525 +0000 UTC m=+1659.731644614" watchObservedRunningTime="2026-03-18 18:30:03.218834812 +0000 UTC 
m=+1659.738307881" Mar 18 18:30:03 crc kubenswrapper[5008]: I0318 18:30:03.510220 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" Mar 18 18:30:03 crc kubenswrapper[5008]: I0318 18:30:03.658852 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdbtx\" (UniqueName: \"kubernetes.io/projected/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-kube-api-access-pdbtx\") pod \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\" (UID: \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\") " Mar 18 18:30:03 crc kubenswrapper[5008]: I0318 18:30:03.659398 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-config-volume\") pod \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\" (UID: \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\") " Mar 18 18:30:03 crc kubenswrapper[5008]: I0318 18:30:03.659511 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-secret-volume\") pod \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\" (UID: \"1cf23e92-a320-42c0-a05e-fe7aa2ce261e\") " Mar 18 18:30:03 crc kubenswrapper[5008]: I0318 18:30:03.661229 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-config-volume" (OuterVolumeSpecName: "config-volume") pod "1cf23e92-a320-42c0-a05e-fe7aa2ce261e" (UID: "1cf23e92-a320-42c0-a05e-fe7aa2ce261e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:30:03 crc kubenswrapper[5008]: I0318 18:30:03.666746 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1cf23e92-a320-42c0-a05e-fe7aa2ce261e" (UID: "1cf23e92-a320-42c0-a05e-fe7aa2ce261e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:30:03 crc kubenswrapper[5008]: I0318 18:30:03.672432 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-kube-api-access-pdbtx" (OuterVolumeSpecName: "kube-api-access-pdbtx") pod "1cf23e92-a320-42c0-a05e-fe7aa2ce261e" (UID: "1cf23e92-a320-42c0-a05e-fe7aa2ce261e"). InnerVolumeSpecName "kube-api-access-pdbtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:30:03 crc kubenswrapper[5008]: I0318 18:30:03.763484 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdbtx\" (UniqueName: \"kubernetes.io/projected/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-kube-api-access-pdbtx\") on node \"crc\" DevicePath \"\"" Mar 18 18:30:03 crc kubenswrapper[5008]: I0318 18:30:03.763520 5008 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:30:03 crc kubenswrapper[5008]: I0318 18:30:03.763531 5008 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1cf23e92-a320-42c0-a05e-fe7aa2ce261e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:30:04 crc kubenswrapper[5008]: I0318 18:30:04.209581 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:30:04 crc kubenswrapper[5008]: E0318 
18:30:04.209981 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:30:04 crc kubenswrapper[5008]: I0318 18:30:04.214280 5008 generic.go:334] "Generic (PLEG): container finished" podID="e689aa22-89d5-442d-8473-6bf84b521b85" containerID="c2bf8d3619c0a40678d5c6947afe832138e74e68850571901ca3a40080bd71d2" exitCode=0 Mar 18 18:30:04 crc kubenswrapper[5008]: I0318 18:30:04.214437 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564310-gkp7r" event={"ID":"e689aa22-89d5-442d-8473-6bf84b521b85","Type":"ContainerDied","Data":"c2bf8d3619c0a40678d5c6947afe832138e74e68850571901ca3a40080bd71d2"} Mar 18 18:30:04 crc kubenswrapper[5008]: I0318 18:30:04.216876 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" event={"ID":"1cf23e92-a320-42c0-a05e-fe7aa2ce261e","Type":"ContainerDied","Data":"426987de9f5b85068de78f0e3b3406b348f813f312553df5ffc1d6f9206cfac3"} Mar 18 18:30:04 crc kubenswrapper[5008]: I0318 18:30:04.216942 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="426987de9f5b85068de78f0e3b3406b348f813f312553df5ffc1d6f9206cfac3" Mar 18 18:30:04 crc kubenswrapper[5008]: I0318 18:30:04.216943 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5" Mar 18 18:30:05 crc kubenswrapper[5008]: I0318 18:30:05.529033 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564310-gkp7r" Mar 18 18:30:05 crc kubenswrapper[5008]: I0318 18:30:05.693225 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd658\" (UniqueName: \"kubernetes.io/projected/e689aa22-89d5-442d-8473-6bf84b521b85-kube-api-access-jd658\") pod \"e689aa22-89d5-442d-8473-6bf84b521b85\" (UID: \"e689aa22-89d5-442d-8473-6bf84b521b85\") " Mar 18 18:30:05 crc kubenswrapper[5008]: I0318 18:30:05.697935 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e689aa22-89d5-442d-8473-6bf84b521b85-kube-api-access-jd658" (OuterVolumeSpecName: "kube-api-access-jd658") pod "e689aa22-89d5-442d-8473-6bf84b521b85" (UID: "e689aa22-89d5-442d-8473-6bf84b521b85"). InnerVolumeSpecName "kube-api-access-jd658". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:30:05 crc kubenswrapper[5008]: I0318 18:30:05.795343 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd658\" (UniqueName: \"kubernetes.io/projected/e689aa22-89d5-442d-8473-6bf84b521b85-kube-api-access-jd658\") on node \"crc\" DevicePath \"\"" Mar 18 18:30:06 crc kubenswrapper[5008]: I0318 18:30:06.234916 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564310-gkp7r" event={"ID":"e689aa22-89d5-442d-8473-6bf84b521b85","Type":"ContainerDied","Data":"0bb0261cbbf7a5cbe56b0d5a8c8b48a0eab2ac68ae50fc28e7472a105450dd12"} Mar 18 18:30:06 crc kubenswrapper[5008]: I0318 18:30:06.234980 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bb0261cbbf7a5cbe56b0d5a8c8b48a0eab2ac68ae50fc28e7472a105450dd12" Mar 18 18:30:06 crc kubenswrapper[5008]: I0318 18:30:06.235045 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564310-gkp7r" Mar 18 18:30:06 crc kubenswrapper[5008]: I0318 18:30:06.381202 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564304-d5g9j"] Mar 18 18:30:06 crc kubenswrapper[5008]: I0318 18:30:06.388472 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564304-d5g9j"] Mar 18 18:30:08 crc kubenswrapper[5008]: I0318 18:30:08.208879 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="117e64db-91f2-46f2-872e-2edba77b07d9" path="/var/lib/kubelet/pods/117e64db-91f2-46f2-872e-2edba77b07d9/volumes" Mar 18 18:30:19 crc kubenswrapper[5008]: I0318 18:30:19.198312 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:30:19 crc kubenswrapper[5008]: E0318 18:30:19.200327 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:30:31 crc kubenswrapper[5008]: I0318 18:30:31.198792 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:30:31 crc kubenswrapper[5008]: E0318 18:30:31.200060 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" 
podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:30:39 crc kubenswrapper[5008]: I0318 18:30:39.494926 5008 scope.go:117] "RemoveContainer" containerID="a7ce89506274663821070b5c6dad5f8257699a1427ed13a75372ea3757a9a56c" Mar 18 18:30:39 crc kubenswrapper[5008]: I0318 18:30:39.556143 5008 scope.go:117] "RemoveContainer" containerID="a90c2927967c80cd7fe00d77b140f18b6966386ce76f1ff72c6c5b909aca51ee" Mar 18 18:30:39 crc kubenswrapper[5008]: I0318 18:30:39.607301 5008 scope.go:117] "RemoveContainer" containerID="37ee661f7953b8d9a32a6d1f71d0668b96eefbaeef42d315f3b9741a68892653" Mar 18 18:30:39 crc kubenswrapper[5008]: I0318 18:30:39.627949 5008 scope.go:117] "RemoveContainer" containerID="c22d63804d2fa8eaa1661c6774af139f50fcab700ab10feba4318bb64f3859aa" Mar 18 18:30:39 crc kubenswrapper[5008]: I0318 18:30:39.660014 5008 scope.go:117] "RemoveContainer" containerID="0467eb604fa1a269513b52ef89a127c20e5e7cac19e0e4d6ff17d11d5afda14e" Mar 18 18:30:46 crc kubenswrapper[5008]: I0318 18:30:46.198362 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:30:46 crc kubenswrapper[5008]: E0318 18:30:46.199627 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:30:59 crc kubenswrapper[5008]: I0318 18:30:59.198133 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:30:59 crc kubenswrapper[5008]: E0318 18:30:59.199041 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:31:14 crc kubenswrapper[5008]: I0318 18:31:14.205766 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:31:14 crc kubenswrapper[5008]: E0318 18:31:14.206581 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:31:25 crc kubenswrapper[5008]: I0318 18:31:25.198827 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:31:25 crc kubenswrapper[5008]: E0318 18:31:25.200132 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:31:38 crc kubenswrapper[5008]: I0318 18:31:38.198972 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:31:38 crc kubenswrapper[5008]: E0318 18:31:38.200013 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:31:39 crc kubenswrapper[5008]: I0318 18:31:39.820185 5008 scope.go:117] "RemoveContainer" containerID="3e4b194f5d89fb322983ad27203a5faf38b48a02e954c6de483187aa905e57d7" Mar 18 18:31:39 crc kubenswrapper[5008]: I0318 18:31:39.848190 5008 scope.go:117] "RemoveContainer" containerID="6aada1075429307e6ed4ed62a5ff39092be34f0cb0df6732338d9512f3d639a5" Mar 18 18:31:39 crc kubenswrapper[5008]: I0318 18:31:39.881826 5008 scope.go:117] "RemoveContainer" containerID="7ca9a60f117bb64900ea7b4debfbb4a22c4100b53a01f204c6a0b9b167d74ccc" Mar 18 18:31:39 crc kubenswrapper[5008]: I0318 18:31:39.908364 5008 scope.go:117] "RemoveContainer" containerID="8b7b60b95877bad7e80482410e97cfbd1d3a6413902243c890181903304f480d" Mar 18 18:31:39 crc kubenswrapper[5008]: I0318 18:31:39.957279 5008 scope.go:117] "RemoveContainer" containerID="f2d4a19da4bef551f9b95cdafd5e8f5836f3a1fef19b332a0a1d64c5b9a2f97e" Mar 18 18:31:39 crc kubenswrapper[5008]: I0318 18:31:39.993900 5008 scope.go:117] "RemoveContainer" containerID="77bd38d173d50c2a5f573f4d959340aa2ed52797345fe6a1d9a1c4829ec6805b" Mar 18 18:31:40 crc kubenswrapper[5008]: I0318 18:31:40.016896 5008 scope.go:117] "RemoveContainer" containerID="8aa1173c358dd95ef278aeaaebc5a33805fabac1f0468c56d4cf68beb6fd7318" Mar 18 18:31:40 crc kubenswrapper[5008]: I0318 18:31:40.052927 5008 scope.go:117] "RemoveContainer" containerID="1480f123b6b5a86f8bb731e4ddad88a0a42b8eaa878d35119d5cc9fa437d9dfd" Mar 18 18:31:40 crc kubenswrapper[5008]: I0318 18:31:40.067718 5008 scope.go:117] "RemoveContainer" containerID="ecc55674f283d6170293a51c52687d333d59880f248c46a61dc6f3dc815eed79" Mar 18 18:31:49 crc kubenswrapper[5008]: I0318 18:31:49.198374 5008 scope.go:117] 
"RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:31:49 crc kubenswrapper[5008]: E0318 18:31:49.199769 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.155181 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564312-gx7lw"] Mar 18 18:32:00 crc kubenswrapper[5008]: E0318 18:32:00.157388 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb89930-99dd-4d82-9982-adf3b3aa39cf" containerName="registry-server" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.157442 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb89930-99dd-4d82-9982-adf3b3aa39cf" containerName="registry-server" Mar 18 18:32:00 crc kubenswrapper[5008]: E0318 18:32:00.157469 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e689aa22-89d5-442d-8473-6bf84b521b85" containerName="oc" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.157485 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e689aa22-89d5-442d-8473-6bf84b521b85" containerName="oc" Mar 18 18:32:00 crc kubenswrapper[5008]: E0318 18:32:00.157516 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf23e92-a320-42c0-a05e-fe7aa2ce261e" containerName="collect-profiles" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.157531 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf23e92-a320-42c0-a05e-fe7aa2ce261e" containerName="collect-profiles" Mar 18 18:32:00 crc kubenswrapper[5008]: E0318 18:32:00.157589 5008 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb89930-99dd-4d82-9982-adf3b3aa39cf" containerName="extract-utilities" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.157604 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb89930-99dd-4d82-9982-adf3b3aa39cf" containerName="extract-utilities" Mar 18 18:32:00 crc kubenswrapper[5008]: E0318 18:32:00.157635 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb89930-99dd-4d82-9982-adf3b3aa39cf" containerName="extract-content" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.157648 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb89930-99dd-4d82-9982-adf3b3aa39cf" containerName="extract-content" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.157923 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cf23e92-a320-42c0-a05e-fe7aa2ce261e" containerName="collect-profiles" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.157951 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e689aa22-89d5-442d-8473-6bf84b521b85" containerName="oc" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.157984 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb89930-99dd-4d82-9982-adf3b3aa39cf" containerName="registry-server" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.159008 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564312-gx7lw" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.162501 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.163032 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.163430 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.168341 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564312-gx7lw"] Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.315911 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v4lh\" (UniqueName: \"kubernetes.io/projected/4cb7d7cb-b374-49f9-ac66-56c1229aa9f7-kube-api-access-4v4lh\") pod \"auto-csr-approver-29564312-gx7lw\" (UID: \"4cb7d7cb-b374-49f9-ac66-56c1229aa9f7\") " pod="openshift-infra/auto-csr-approver-29564312-gx7lw" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.418071 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v4lh\" (UniqueName: \"kubernetes.io/projected/4cb7d7cb-b374-49f9-ac66-56c1229aa9f7-kube-api-access-4v4lh\") pod \"auto-csr-approver-29564312-gx7lw\" (UID: \"4cb7d7cb-b374-49f9-ac66-56c1229aa9f7\") " pod="openshift-infra/auto-csr-approver-29564312-gx7lw" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.453522 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v4lh\" (UniqueName: \"kubernetes.io/projected/4cb7d7cb-b374-49f9-ac66-56c1229aa9f7-kube-api-access-4v4lh\") pod \"auto-csr-approver-29564312-gx7lw\" (UID: \"4cb7d7cb-b374-49f9-ac66-56c1229aa9f7\") " 
pod="openshift-infra/auto-csr-approver-29564312-gx7lw" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.483881 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564312-gx7lw" Mar 18 18:32:00 crc kubenswrapper[5008]: I0318 18:32:00.751930 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564312-gx7lw"] Mar 18 18:32:01 crc kubenswrapper[5008]: I0318 18:32:01.303230 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564312-gx7lw" event={"ID":"4cb7d7cb-b374-49f9-ac66-56c1229aa9f7","Type":"ContainerStarted","Data":"3ac06f649f53de5b6c78924d4253a6bfbe9e44b7fd28a1ec2e5c28b7e89a5606"} Mar 18 18:32:03 crc kubenswrapper[5008]: I0318 18:32:03.318693 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564312-gx7lw" event={"ID":"4cb7d7cb-b374-49f9-ac66-56c1229aa9f7","Type":"ContainerStarted","Data":"588423083d7b0d4e4a689a21050c1765b6c46f1c21c24744afe7b3f6aad7ff66"} Mar 18 18:32:03 crc kubenswrapper[5008]: I0318 18:32:03.332950 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564312-gx7lw" podStartSLOduration=1.211588574 podStartE2EDuration="3.332929037s" podCreationTimestamp="2026-03-18 18:32:00 +0000 UTC" firstStartedPulling="2026-03-18 18:32:00.757108297 +0000 UTC m=+1777.276581386" lastFinishedPulling="2026-03-18 18:32:02.87844875 +0000 UTC m=+1779.397921849" observedRunningTime="2026-03-18 18:32:03.331321334 +0000 UTC m=+1779.850794423" watchObservedRunningTime="2026-03-18 18:32:03.332929037 +0000 UTC m=+1779.852402126" Mar 18 18:32:04 crc kubenswrapper[5008]: I0318 18:32:04.203670 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:32:04 crc kubenswrapper[5008]: E0318 18:32:04.204104 5008 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:32:04 crc kubenswrapper[5008]: I0318 18:32:04.333630 5008 generic.go:334] "Generic (PLEG): container finished" podID="4cb7d7cb-b374-49f9-ac66-56c1229aa9f7" containerID="588423083d7b0d4e4a689a21050c1765b6c46f1c21c24744afe7b3f6aad7ff66" exitCode=0 Mar 18 18:32:04 crc kubenswrapper[5008]: I0318 18:32:04.333705 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564312-gx7lw" event={"ID":"4cb7d7cb-b374-49f9-ac66-56c1229aa9f7","Type":"ContainerDied","Data":"588423083d7b0d4e4a689a21050c1765b6c46f1c21c24744afe7b3f6aad7ff66"} Mar 18 18:32:05 crc kubenswrapper[5008]: I0318 18:32:05.932623 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564312-gx7lw" Mar 18 18:32:06 crc kubenswrapper[5008]: I0318 18:32:06.108157 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v4lh\" (UniqueName: \"kubernetes.io/projected/4cb7d7cb-b374-49f9-ac66-56c1229aa9f7-kube-api-access-4v4lh\") pod \"4cb7d7cb-b374-49f9-ac66-56c1229aa9f7\" (UID: \"4cb7d7cb-b374-49f9-ac66-56c1229aa9f7\") " Mar 18 18:32:06 crc kubenswrapper[5008]: I0318 18:32:06.114675 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb7d7cb-b374-49f9-ac66-56c1229aa9f7-kube-api-access-4v4lh" (OuterVolumeSpecName: "kube-api-access-4v4lh") pod "4cb7d7cb-b374-49f9-ac66-56c1229aa9f7" (UID: "4cb7d7cb-b374-49f9-ac66-56c1229aa9f7"). InnerVolumeSpecName "kube-api-access-4v4lh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:32:06 crc kubenswrapper[5008]: I0318 18:32:06.210105 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v4lh\" (UniqueName: \"kubernetes.io/projected/4cb7d7cb-b374-49f9-ac66-56c1229aa9f7-kube-api-access-4v4lh\") on node \"crc\" DevicePath \"\"" Mar 18 18:32:06 crc kubenswrapper[5008]: I0318 18:32:06.353662 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564312-gx7lw" event={"ID":"4cb7d7cb-b374-49f9-ac66-56c1229aa9f7","Type":"ContainerDied","Data":"3ac06f649f53de5b6c78924d4253a6bfbe9e44b7fd28a1ec2e5c28b7e89a5606"} Mar 18 18:32:06 crc kubenswrapper[5008]: I0318 18:32:06.353726 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ac06f649f53de5b6c78924d4253a6bfbe9e44b7fd28a1ec2e5c28b7e89a5606" Mar 18 18:32:06 crc kubenswrapper[5008]: I0318 18:32:06.353797 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564312-gx7lw" Mar 18 18:32:06 crc kubenswrapper[5008]: I0318 18:32:06.412918 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564306-zjwp8"] Mar 18 18:32:06 crc kubenswrapper[5008]: I0318 18:32:06.420225 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564306-zjwp8"] Mar 18 18:32:08 crc kubenswrapper[5008]: I0318 18:32:08.214739 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e86e59-8c2c-45e0-b661-4e9e815e138b" path="/var/lib/kubelet/pods/26e86e59-8c2c-45e0-b661-4e9e815e138b/volumes" Mar 18 18:32:16 crc kubenswrapper[5008]: I0318 18:32:16.198654 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:32:16 crc kubenswrapper[5008]: E0318 18:32:16.199417 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:32:30 crc kubenswrapper[5008]: I0318 18:32:30.198871 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:32:30 crc kubenswrapper[5008]: E0318 18:32:30.214115 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:32:40 crc kubenswrapper[5008]: I0318 18:32:40.280946 5008 scope.go:117] "RemoveContainer" containerID="636a655d1d4b129edd75d71f58c73e9276309075062203288ef5db3fbb347e6f" Mar 18 18:32:40 crc kubenswrapper[5008]: I0318 18:32:40.337291 5008 scope.go:117] "RemoveContainer" containerID="2e5a59c9d7fb057e0e04ebc89497f693bac1a00de19d2d85b96fb8e2a863fd3b" Mar 18 18:32:41 crc kubenswrapper[5008]: I0318 18:32:41.198598 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:32:41 crc kubenswrapper[5008]: E0318 18:32:41.199023 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:32:55 crc kubenswrapper[5008]: I0318 18:32:55.198619 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:32:55 crc kubenswrapper[5008]: E0318 18:32:55.199265 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:33:08 crc kubenswrapper[5008]: I0318 18:33:08.199258 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:33:08 crc kubenswrapper[5008]: E0318 18:33:08.200444 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:33:22 crc kubenswrapper[5008]: I0318 18:33:22.198200 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:33:22 crc kubenswrapper[5008]: E0318 18:33:22.199281 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:33:34 crc kubenswrapper[5008]: I0318 18:33:34.207467 5008 scope.go:117] "RemoveContainer" containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:33:35 crc kubenswrapper[5008]: I0318 18:33:35.149103 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"6a517949b9ff0064573ecb8a2b93943d6bc661b4978c01e5e21d16dfc8f19892"} Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.164588 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564314-zccgd"] Mar 18 18:34:00 crc kubenswrapper[5008]: E0318 18:34:00.166331 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb7d7cb-b374-49f9-ac66-56c1229aa9f7" containerName="oc" Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.166360 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb7d7cb-b374-49f9-ac66-56c1229aa9f7" containerName="oc" Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.166669 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb7d7cb-b374-49f9-ac66-56c1229aa9f7" containerName="oc" Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.167373 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564314-zccgd" Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.175473 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.175668 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.177206 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.179309 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564314-zccgd"] Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.336276 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s475s\" (UniqueName: \"kubernetes.io/projected/d1465280-b484-42fc-9162-101f8fb8f308-kube-api-access-s475s\") pod \"auto-csr-approver-29564314-zccgd\" (UID: \"d1465280-b484-42fc-9162-101f8fb8f308\") " pod="openshift-infra/auto-csr-approver-29564314-zccgd" Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.437873 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s475s\" (UniqueName: \"kubernetes.io/projected/d1465280-b484-42fc-9162-101f8fb8f308-kube-api-access-s475s\") pod \"auto-csr-approver-29564314-zccgd\" (UID: \"d1465280-b484-42fc-9162-101f8fb8f308\") " pod="openshift-infra/auto-csr-approver-29564314-zccgd" Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.466298 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s475s\" (UniqueName: \"kubernetes.io/projected/d1465280-b484-42fc-9162-101f8fb8f308-kube-api-access-s475s\") pod \"auto-csr-approver-29564314-zccgd\" (UID: \"d1465280-b484-42fc-9162-101f8fb8f308\") " 
pod="openshift-infra/auto-csr-approver-29564314-zccgd" Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.502395 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564314-zccgd" Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.984961 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564314-zccgd"] Mar 18 18:34:00 crc kubenswrapper[5008]: W0318 18:34:00.993925 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1465280_b484_42fc_9162_101f8fb8f308.slice/crio-032910b303a56d02367f6d0bc6db61eaafe670d1e4ee9a40c1c7f2166ba1a751 WatchSource:0}: Error finding container 032910b303a56d02367f6d0bc6db61eaafe670d1e4ee9a40c1c7f2166ba1a751: Status 404 returned error can't find the container with id 032910b303a56d02367f6d0bc6db61eaafe670d1e4ee9a40c1c7f2166ba1a751 Mar 18 18:34:00 crc kubenswrapper[5008]: I0318 18:34:00.997585 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:34:01 crc kubenswrapper[5008]: I0318 18:34:01.385216 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564314-zccgd" event={"ID":"d1465280-b484-42fc-9162-101f8fb8f308","Type":"ContainerStarted","Data":"032910b303a56d02367f6d0bc6db61eaafe670d1e4ee9a40c1c7f2166ba1a751"} Mar 18 18:34:03 crc kubenswrapper[5008]: I0318 18:34:03.402777 5008 generic.go:334] "Generic (PLEG): container finished" podID="d1465280-b484-42fc-9162-101f8fb8f308" containerID="eda1ccd22e49549bc71e9b1bd2a7f687ce2d7cb44a4eb2cba1b9aaa287224bd5" exitCode=0 Mar 18 18:34:03 crc kubenswrapper[5008]: I0318 18:34:03.402843 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564314-zccgd" 
event={"ID":"d1465280-b484-42fc-9162-101f8fb8f308","Type":"ContainerDied","Data":"eda1ccd22e49549bc71e9b1bd2a7f687ce2d7cb44a4eb2cba1b9aaa287224bd5"} Mar 18 18:34:04 crc kubenswrapper[5008]: I0318 18:34:04.713992 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564314-zccgd" Mar 18 18:34:04 crc kubenswrapper[5008]: I0318 18:34:04.804314 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s475s\" (UniqueName: \"kubernetes.io/projected/d1465280-b484-42fc-9162-101f8fb8f308-kube-api-access-s475s\") pod \"d1465280-b484-42fc-9162-101f8fb8f308\" (UID: \"d1465280-b484-42fc-9162-101f8fb8f308\") " Mar 18 18:34:04 crc kubenswrapper[5008]: I0318 18:34:04.813452 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1465280-b484-42fc-9162-101f8fb8f308-kube-api-access-s475s" (OuterVolumeSpecName: "kube-api-access-s475s") pod "d1465280-b484-42fc-9162-101f8fb8f308" (UID: "d1465280-b484-42fc-9162-101f8fb8f308"). InnerVolumeSpecName "kube-api-access-s475s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:34:04 crc kubenswrapper[5008]: I0318 18:34:04.906602 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s475s\" (UniqueName: \"kubernetes.io/projected/d1465280-b484-42fc-9162-101f8fb8f308-kube-api-access-s475s\") on node \"crc\" DevicePath \"\"" Mar 18 18:34:05 crc kubenswrapper[5008]: I0318 18:34:05.424404 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564314-zccgd" event={"ID":"d1465280-b484-42fc-9162-101f8fb8f308","Type":"ContainerDied","Data":"032910b303a56d02367f6d0bc6db61eaafe670d1e4ee9a40c1c7f2166ba1a751"} Mar 18 18:34:05 crc kubenswrapper[5008]: I0318 18:34:05.424452 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="032910b303a56d02367f6d0bc6db61eaafe670d1e4ee9a40c1c7f2166ba1a751" Mar 18 18:34:05 crc kubenswrapper[5008]: I0318 18:34:05.424967 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564314-zccgd" Mar 18 18:34:05 crc kubenswrapper[5008]: I0318 18:34:05.807708 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564308-vdh85"] Mar 18 18:34:05 crc kubenswrapper[5008]: I0318 18:34:05.814426 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564308-vdh85"] Mar 18 18:34:06 crc kubenswrapper[5008]: I0318 18:34:06.210533 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc5fc756-c281-47a2-bc46-1fbb55f5677c" path="/var/lib/kubelet/pods/cc5fc756-c281-47a2-bc46-1fbb55f5677c/volumes" Mar 18 18:34:40 crc kubenswrapper[5008]: I0318 18:34:40.461798 5008 scope.go:117] "RemoveContainer" containerID="2d17ffe491997283a42141fddc12fbcd7818d363abf1384a438224fc59e31093" Mar 18 18:35:54 crc kubenswrapper[5008]: I0318 18:35:54.460009 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:35:54 crc kubenswrapper[5008]: I0318 18:35:54.460775 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:36:00 crc kubenswrapper[5008]: I0318 18:36:00.161079 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564316-z8lsc"] Mar 18 18:36:00 crc kubenswrapper[5008]: E0318 18:36:00.161914 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1465280-b484-42fc-9162-101f8fb8f308" containerName="oc" Mar 18 18:36:00 crc kubenswrapper[5008]: I0318 18:36:00.161935 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1465280-b484-42fc-9162-101f8fb8f308" containerName="oc" Mar 18 18:36:00 crc kubenswrapper[5008]: I0318 18:36:00.162162 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1465280-b484-42fc-9162-101f8fb8f308" containerName="oc" Mar 18 18:36:00 crc kubenswrapper[5008]: I0318 18:36:00.162877 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564316-z8lsc" Mar 18 18:36:00 crc kubenswrapper[5008]: I0318 18:36:00.165435 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:36:00 crc kubenswrapper[5008]: I0318 18:36:00.165664 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:36:00 crc kubenswrapper[5008]: I0318 18:36:00.167085 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:36:00 crc kubenswrapper[5008]: I0318 18:36:00.169500 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564316-z8lsc"] Mar 18 18:36:00 crc kubenswrapper[5008]: I0318 18:36:00.295047 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phbmb\" (UniqueName: \"kubernetes.io/projected/16726099-fb12-4782-ae36-7bd417eaac45-kube-api-access-phbmb\") pod \"auto-csr-approver-29564316-z8lsc\" (UID: \"16726099-fb12-4782-ae36-7bd417eaac45\") " pod="openshift-infra/auto-csr-approver-29564316-z8lsc" Mar 18 18:36:00 crc kubenswrapper[5008]: I0318 18:36:00.396342 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phbmb\" (UniqueName: \"kubernetes.io/projected/16726099-fb12-4782-ae36-7bd417eaac45-kube-api-access-phbmb\") pod \"auto-csr-approver-29564316-z8lsc\" (UID: \"16726099-fb12-4782-ae36-7bd417eaac45\") " pod="openshift-infra/auto-csr-approver-29564316-z8lsc" Mar 18 18:36:00 crc kubenswrapper[5008]: I0318 18:36:00.417482 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phbmb\" (UniqueName: \"kubernetes.io/projected/16726099-fb12-4782-ae36-7bd417eaac45-kube-api-access-phbmb\") pod \"auto-csr-approver-29564316-z8lsc\" (UID: \"16726099-fb12-4782-ae36-7bd417eaac45\") " 
pod="openshift-infra/auto-csr-approver-29564316-z8lsc" Mar 18 18:36:00 crc kubenswrapper[5008]: I0318 18:36:00.482858 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564316-z8lsc" Mar 18 18:36:00 crc kubenswrapper[5008]: I0318 18:36:00.959108 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564316-z8lsc"] Mar 18 18:36:01 crc kubenswrapper[5008]: I0318 18:36:01.395605 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564316-z8lsc" event={"ID":"16726099-fb12-4782-ae36-7bd417eaac45","Type":"ContainerStarted","Data":"9a6b5468a5a522adee5b57c6e72f6f38dca48b75adf5c6696d32c4f0faaa27a7"} Mar 18 18:36:02 crc kubenswrapper[5008]: I0318 18:36:02.404517 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564316-z8lsc" event={"ID":"16726099-fb12-4782-ae36-7bd417eaac45","Type":"ContainerStarted","Data":"27d10a4f92b5966695a1e0d5444ab0aadf418f9d504e9af1fac6848f552933ac"} Mar 18 18:36:02 crc kubenswrapper[5008]: I0318 18:36:02.427703 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564316-z8lsc" podStartSLOduration=1.4565314169999999 podStartE2EDuration="2.427674958s" podCreationTimestamp="2026-03-18 18:36:00 +0000 UTC" firstStartedPulling="2026-03-18 18:36:00.971310866 +0000 UTC m=+2017.490783945" lastFinishedPulling="2026-03-18 18:36:01.942454387 +0000 UTC m=+2018.461927486" observedRunningTime="2026-03-18 18:36:02.418633118 +0000 UTC m=+2018.938106227" watchObservedRunningTime="2026-03-18 18:36:02.427674958 +0000 UTC m=+2018.947148057" Mar 18 18:36:03 crc kubenswrapper[5008]: I0318 18:36:03.417035 5008 generic.go:334] "Generic (PLEG): container finished" podID="16726099-fb12-4782-ae36-7bd417eaac45" containerID="27d10a4f92b5966695a1e0d5444ab0aadf418f9d504e9af1fac6848f552933ac" exitCode=0 Mar 18 18:36:03 crc 
kubenswrapper[5008]: I0318 18:36:03.417114 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564316-z8lsc" event={"ID":"16726099-fb12-4782-ae36-7bd417eaac45","Type":"ContainerDied","Data":"27d10a4f92b5966695a1e0d5444ab0aadf418f9d504e9af1fac6848f552933ac"} Mar 18 18:36:04 crc kubenswrapper[5008]: I0318 18:36:04.834417 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564316-z8lsc" Mar 18 18:36:05 crc kubenswrapper[5008]: I0318 18:36:05.003613 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phbmb\" (UniqueName: \"kubernetes.io/projected/16726099-fb12-4782-ae36-7bd417eaac45-kube-api-access-phbmb\") pod \"16726099-fb12-4782-ae36-7bd417eaac45\" (UID: \"16726099-fb12-4782-ae36-7bd417eaac45\") " Mar 18 18:36:05 crc kubenswrapper[5008]: I0318 18:36:05.009068 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16726099-fb12-4782-ae36-7bd417eaac45-kube-api-access-phbmb" (OuterVolumeSpecName: "kube-api-access-phbmb") pod "16726099-fb12-4782-ae36-7bd417eaac45" (UID: "16726099-fb12-4782-ae36-7bd417eaac45"). InnerVolumeSpecName "kube-api-access-phbmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:36:05 crc kubenswrapper[5008]: I0318 18:36:05.105358 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phbmb\" (UniqueName: \"kubernetes.io/projected/16726099-fb12-4782-ae36-7bd417eaac45-kube-api-access-phbmb\") on node \"crc\" DevicePath \"\"" Mar 18 18:36:05 crc kubenswrapper[5008]: I0318 18:36:05.436028 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564316-z8lsc" event={"ID":"16726099-fb12-4782-ae36-7bd417eaac45","Type":"ContainerDied","Data":"9a6b5468a5a522adee5b57c6e72f6f38dca48b75adf5c6696d32c4f0faaa27a7"} Mar 18 18:36:05 crc kubenswrapper[5008]: I0318 18:36:05.436097 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a6b5468a5a522adee5b57c6e72f6f38dca48b75adf5c6696d32c4f0faaa27a7" Mar 18 18:36:05 crc kubenswrapper[5008]: I0318 18:36:05.436140 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564316-z8lsc" Mar 18 18:36:05 crc kubenswrapper[5008]: I0318 18:36:05.507200 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564310-gkp7r"] Mar 18 18:36:05 crc kubenswrapper[5008]: I0318 18:36:05.513986 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564310-gkp7r"] Mar 18 18:36:06 crc kubenswrapper[5008]: I0318 18:36:06.214057 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e689aa22-89d5-442d-8473-6bf84b521b85" path="/var/lib/kubelet/pods/e689aa22-89d5-442d-8473-6bf84b521b85/volumes" Mar 18 18:36:24 crc kubenswrapper[5008]: I0318 18:36:24.460480 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 18:36:24 crc kubenswrapper[5008]: I0318 18:36:24.461393 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:36:40 crc kubenswrapper[5008]: I0318 18:36:40.568902 5008 scope.go:117] "RemoveContainer" containerID="c2bf8d3619c0a40678d5c6947afe832138e74e68850571901ca3a40080bd71d2" Mar 18 18:36:47 crc kubenswrapper[5008]: I0318 18:36:47.968087 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xdz9k"] Mar 18 18:36:47 crc kubenswrapper[5008]: E0318 18:36:47.969205 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16726099-fb12-4782-ae36-7bd417eaac45" containerName="oc" Mar 18 18:36:47 crc kubenswrapper[5008]: I0318 18:36:47.969225 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="16726099-fb12-4782-ae36-7bd417eaac45" containerName="oc" Mar 18 18:36:47 crc kubenswrapper[5008]: I0318 18:36:47.969441 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="16726099-fb12-4782-ae36-7bd417eaac45" containerName="oc" Mar 18 18:36:47 crc kubenswrapper[5008]: I0318 18:36:47.971005 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:47 crc kubenswrapper[5008]: I0318 18:36:47.992948 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xdz9k"] Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.156523 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e565e85-8023-4a2b-b989-4bec0108cbf5-catalog-content\") pod \"community-operators-xdz9k\" (UID: \"9e565e85-8023-4a2b-b989-4bec0108cbf5\") " pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.156693 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e565e85-8023-4a2b-b989-4bec0108cbf5-utilities\") pod \"community-operators-xdz9k\" (UID: \"9e565e85-8023-4a2b-b989-4bec0108cbf5\") " pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.156738 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4j7j\" (UniqueName: \"kubernetes.io/projected/9e565e85-8023-4a2b-b989-4bec0108cbf5-kube-api-access-l4j7j\") pod \"community-operators-xdz9k\" (UID: \"9e565e85-8023-4a2b-b989-4bec0108cbf5\") " pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.258268 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4j7j\" (UniqueName: \"kubernetes.io/projected/9e565e85-8023-4a2b-b989-4bec0108cbf5-kube-api-access-l4j7j\") pod \"community-operators-xdz9k\" (UID: \"9e565e85-8023-4a2b-b989-4bec0108cbf5\") " pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.258338 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e565e85-8023-4a2b-b989-4bec0108cbf5-catalog-content\") pod \"community-operators-xdz9k\" (UID: \"9e565e85-8023-4a2b-b989-4bec0108cbf5\") " pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.258405 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e565e85-8023-4a2b-b989-4bec0108cbf5-utilities\") pod \"community-operators-xdz9k\" (UID: \"9e565e85-8023-4a2b-b989-4bec0108cbf5\") " pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.258849 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e565e85-8023-4a2b-b989-4bec0108cbf5-utilities\") pod \"community-operators-xdz9k\" (UID: \"9e565e85-8023-4a2b-b989-4bec0108cbf5\") " pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.258958 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e565e85-8023-4a2b-b989-4bec0108cbf5-catalog-content\") pod \"community-operators-xdz9k\" (UID: \"9e565e85-8023-4a2b-b989-4bec0108cbf5\") " pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.281228 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4j7j\" (UniqueName: \"kubernetes.io/projected/9e565e85-8023-4a2b-b989-4bec0108cbf5-kube-api-access-l4j7j\") pod \"community-operators-xdz9k\" (UID: \"9e565e85-8023-4a2b-b989-4bec0108cbf5\") " pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.298335 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.567607 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xdz9k"] Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.874258 5008 generic.go:334] "Generic (PLEG): container finished" podID="9e565e85-8023-4a2b-b989-4bec0108cbf5" containerID="4e01595817ffedfc9eb2653e1e2c90b13a13e0ee711b37182f950c7d0f16def6" exitCode=0 Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.874304 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdz9k" event={"ID":"9e565e85-8023-4a2b-b989-4bec0108cbf5","Type":"ContainerDied","Data":"4e01595817ffedfc9eb2653e1e2c90b13a13e0ee711b37182f950c7d0f16def6"} Mar 18 18:36:48 crc kubenswrapper[5008]: I0318 18:36:48.874864 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdz9k" event={"ID":"9e565e85-8023-4a2b-b989-4bec0108cbf5","Type":"ContainerStarted","Data":"95b33ab2b524e72a398905c91e585a912115a66cdff618b36f125d2f7177fa41"} Mar 18 18:36:49 crc kubenswrapper[5008]: I0318 18:36:49.886671 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdz9k" event={"ID":"9e565e85-8023-4a2b-b989-4bec0108cbf5","Type":"ContainerStarted","Data":"6350d6110b2f3892c41628b7e56252a3d0e81ac6ebcd5b43163c946ff1015686"} Mar 18 18:36:50 crc kubenswrapper[5008]: I0318 18:36:50.900089 5008 generic.go:334] "Generic (PLEG): container finished" podID="9e565e85-8023-4a2b-b989-4bec0108cbf5" containerID="6350d6110b2f3892c41628b7e56252a3d0e81ac6ebcd5b43163c946ff1015686" exitCode=0 Mar 18 18:36:50 crc kubenswrapper[5008]: I0318 18:36:50.900173 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdz9k" 
event={"ID":"9e565e85-8023-4a2b-b989-4bec0108cbf5","Type":"ContainerDied","Data":"6350d6110b2f3892c41628b7e56252a3d0e81ac6ebcd5b43163c946ff1015686"} Mar 18 18:36:52 crc kubenswrapper[5008]: I0318 18:36:52.920487 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdz9k" event={"ID":"9e565e85-8023-4a2b-b989-4bec0108cbf5","Type":"ContainerStarted","Data":"1505e11d7445a59e7632fed0b8602110b3fbe6b69e22ff0d0b14e9f9e1117f08"} Mar 18 18:36:52 crc kubenswrapper[5008]: I0318 18:36:52.954270 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xdz9k" podStartSLOduration=2.373317737 podStartE2EDuration="5.954240047s" podCreationTimestamp="2026-03-18 18:36:47 +0000 UTC" firstStartedPulling="2026-03-18 18:36:48.875971795 +0000 UTC m=+2065.395444874" lastFinishedPulling="2026-03-18 18:36:52.456894105 +0000 UTC m=+2068.976367184" observedRunningTime="2026-03-18 18:36:52.948909376 +0000 UTC m=+2069.468382525" watchObservedRunningTime="2026-03-18 18:36:52.954240047 +0000 UTC m=+2069.473713166" Mar 18 18:36:54 crc kubenswrapper[5008]: I0318 18:36:54.460052 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:36:54 crc kubenswrapper[5008]: I0318 18:36:54.460135 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:36:54 crc kubenswrapper[5008]: I0318 18:36:54.460201 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:36:54 crc kubenswrapper[5008]: I0318 18:36:54.461184 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a517949b9ff0064573ecb8a2b93943d6bc661b4978c01e5e21d16dfc8f19892"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:36:54 crc kubenswrapper[5008]: I0318 18:36:54.461282 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://6a517949b9ff0064573ecb8a2b93943d6bc661b4978c01e5e21d16dfc8f19892" gracePeriod=600 Mar 18 18:36:54 crc kubenswrapper[5008]: I0318 18:36:54.938571 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="6a517949b9ff0064573ecb8a2b93943d6bc661b4978c01e5e21d16dfc8f19892" exitCode=0 Mar 18 18:36:54 crc kubenswrapper[5008]: I0318 18:36:54.938589 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"6a517949b9ff0064573ecb8a2b93943d6bc661b4978c01e5e21d16dfc8f19892"} Mar 18 18:36:54 crc kubenswrapper[5008]: I0318 18:36:54.938916 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c"} Mar 18 18:36:54 crc kubenswrapper[5008]: I0318 18:36:54.938943 5008 scope.go:117] "RemoveContainer" 
containerID="ece97d8d6bcefac637bb22f6ab25d08ce5eb042425f7093c077cff51343b5241" Mar 18 18:36:58 crc kubenswrapper[5008]: I0318 18:36:58.298697 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:58 crc kubenswrapper[5008]: I0318 18:36:58.299225 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:58 crc kubenswrapper[5008]: I0318 18:36:58.345895 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:59 crc kubenswrapper[5008]: I0318 18:36:59.066460 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xdz9k" Mar 18 18:36:59 crc kubenswrapper[5008]: I0318 18:36:59.167860 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xdz9k"] Mar 18 18:37:00 crc kubenswrapper[5008]: I0318 18:37:00.985412 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hvt2w"] Mar 18 18:37:00 crc kubenswrapper[5008]: I0318 18:37:00.989773 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hvt2w" Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.011938 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xdz9k" podUID="9e565e85-8023-4a2b-b989-4bec0108cbf5" containerName="registry-server" containerID="cri-o://1505e11d7445a59e7632fed0b8602110b3fbe6b69e22ff0d0b14e9f9e1117f08" gracePeriod=2 Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.030068 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvt2w"] Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.074850 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8a16325-dfb7-4283-8338-8df7160a978a-utilities\") pod \"redhat-operators-hvt2w\" (UID: \"e8a16325-dfb7-4283-8338-8df7160a978a\") " pod="openshift-marketplace/redhat-operators-hvt2w" Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.074946 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8a16325-dfb7-4283-8338-8df7160a978a-catalog-content\") pod \"redhat-operators-hvt2w\" (UID: \"e8a16325-dfb7-4283-8338-8df7160a978a\") " pod="openshift-marketplace/redhat-operators-hvt2w" Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.075048 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4b77\" (UniqueName: \"kubernetes.io/projected/e8a16325-dfb7-4283-8338-8df7160a978a-kube-api-access-f4b77\") pod \"redhat-operators-hvt2w\" (UID: \"e8a16325-dfb7-4283-8338-8df7160a978a\") " pod="openshift-marketplace/redhat-operators-hvt2w" Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.176320 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8a16325-dfb7-4283-8338-8df7160a978a-utilities\") pod \"redhat-operators-hvt2w\" (UID: \"e8a16325-dfb7-4283-8338-8df7160a978a\") " pod="openshift-marketplace/redhat-operators-hvt2w" Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.176446 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8a16325-dfb7-4283-8338-8df7160a978a-catalog-content\") pod \"redhat-operators-hvt2w\" (UID: \"e8a16325-dfb7-4283-8338-8df7160a978a\") " pod="openshift-marketplace/redhat-operators-hvt2w" Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.176818 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4b77\" (UniqueName: \"kubernetes.io/projected/e8a16325-dfb7-4283-8338-8df7160a978a-kube-api-access-f4b77\") pod \"redhat-operators-hvt2w\" (UID: \"e8a16325-dfb7-4283-8338-8df7160a978a\") " pod="openshift-marketplace/redhat-operators-hvt2w" Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.176942 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8a16325-dfb7-4283-8338-8df7160a978a-utilities\") pod \"redhat-operators-hvt2w\" (UID: \"e8a16325-dfb7-4283-8338-8df7160a978a\") " pod="openshift-marketplace/redhat-operators-hvt2w" Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.177485 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8a16325-dfb7-4283-8338-8df7160a978a-catalog-content\") pod \"redhat-operators-hvt2w\" (UID: \"e8a16325-dfb7-4283-8338-8df7160a978a\") " pod="openshift-marketplace/redhat-operators-hvt2w" Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.200423 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4b77\" (UniqueName: 
\"kubernetes.io/projected/e8a16325-dfb7-4283-8338-8df7160a978a-kube-api-access-f4b77\") pod \"redhat-operators-hvt2w\" (UID: \"e8a16325-dfb7-4283-8338-8df7160a978a\") " pod="openshift-marketplace/redhat-operators-hvt2w"
Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.336022 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvt2w"
Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.419087 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xdz9k"
Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.584194 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4j7j\" (UniqueName: \"kubernetes.io/projected/9e565e85-8023-4a2b-b989-4bec0108cbf5-kube-api-access-l4j7j\") pod \"9e565e85-8023-4a2b-b989-4bec0108cbf5\" (UID: \"9e565e85-8023-4a2b-b989-4bec0108cbf5\") "
Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.584234 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e565e85-8023-4a2b-b989-4bec0108cbf5-catalog-content\") pod \"9e565e85-8023-4a2b-b989-4bec0108cbf5\" (UID: \"9e565e85-8023-4a2b-b989-4bec0108cbf5\") "
Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.584262 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e565e85-8023-4a2b-b989-4bec0108cbf5-utilities\") pod \"9e565e85-8023-4a2b-b989-4bec0108cbf5\" (UID: \"9e565e85-8023-4a2b-b989-4bec0108cbf5\") "
Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.585572 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e565e85-8023-4a2b-b989-4bec0108cbf5-utilities" (OuterVolumeSpecName: "utilities") pod "9e565e85-8023-4a2b-b989-4bec0108cbf5" (UID: "9e565e85-8023-4a2b-b989-4bec0108cbf5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.601343 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e565e85-8023-4a2b-b989-4bec0108cbf5-kube-api-access-l4j7j" (OuterVolumeSpecName: "kube-api-access-l4j7j") pod "9e565e85-8023-4a2b-b989-4bec0108cbf5" (UID: "9e565e85-8023-4a2b-b989-4bec0108cbf5"). InnerVolumeSpecName "kube-api-access-l4j7j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.654388 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e565e85-8023-4a2b-b989-4bec0108cbf5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e565e85-8023-4a2b-b989-4bec0108cbf5" (UID: "9e565e85-8023-4a2b-b989-4bec0108cbf5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.685955 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4j7j\" (UniqueName: \"kubernetes.io/projected/9e565e85-8023-4a2b-b989-4bec0108cbf5-kube-api-access-l4j7j\") on node \"crc\" DevicePath \"\""
Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.686260 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e565e85-8023-4a2b-b989-4bec0108cbf5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.686436 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e565e85-8023-4a2b-b989-4bec0108cbf5-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:37:01 crc kubenswrapper[5008]: I0318 18:37:01.765537 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvt2w"]
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.024985 5008 generic.go:334] "Generic (PLEG): container finished" podID="9e565e85-8023-4a2b-b989-4bec0108cbf5" containerID="1505e11d7445a59e7632fed0b8602110b3fbe6b69e22ff0d0b14e9f9e1117f08" exitCode=0
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.025121 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xdz9k"
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.025798 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdz9k" event={"ID":"9e565e85-8023-4a2b-b989-4bec0108cbf5","Type":"ContainerDied","Data":"1505e11d7445a59e7632fed0b8602110b3fbe6b69e22ff0d0b14e9f9e1117f08"}
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.025842 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdz9k" event={"ID":"9e565e85-8023-4a2b-b989-4bec0108cbf5","Type":"ContainerDied","Data":"95b33ab2b524e72a398905c91e585a912115a66cdff618b36f125d2f7177fa41"}
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.025874 5008 scope.go:117] "RemoveContainer" containerID="1505e11d7445a59e7632fed0b8602110b3fbe6b69e22ff0d0b14e9f9e1117f08"
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.028748 5008 generic.go:334] "Generic (PLEG): container finished" podID="e8a16325-dfb7-4283-8338-8df7160a978a" containerID="2c446bf8b306a088e77c8a5e9925d232cb148f4c6016c66c0323b8695ae941fa" exitCode=0
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.028790 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvt2w" event={"ID":"e8a16325-dfb7-4283-8338-8df7160a978a","Type":"ContainerDied","Data":"2c446bf8b306a088e77c8a5e9925d232cb148f4c6016c66c0323b8695ae941fa"}
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.028818 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvt2w" event={"ID":"e8a16325-dfb7-4283-8338-8df7160a978a","Type":"ContainerStarted","Data":"29eb17dbad509013efcc2551885aeeac7685300bd3b8469b3fa1112dfc234dff"}
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.061671 5008 scope.go:117] "RemoveContainer" containerID="6350d6110b2f3892c41628b7e56252a3d0e81ac6ebcd5b43163c946ff1015686"
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.071832 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xdz9k"]
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.080014 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xdz9k"]
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.088003 5008 scope.go:117] "RemoveContainer" containerID="4e01595817ffedfc9eb2653e1e2c90b13a13e0ee711b37182f950c7d0f16def6"
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.102474 5008 scope.go:117] "RemoveContainer" containerID="1505e11d7445a59e7632fed0b8602110b3fbe6b69e22ff0d0b14e9f9e1117f08"
Mar 18 18:37:02 crc kubenswrapper[5008]: E0318 18:37:02.102981 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1505e11d7445a59e7632fed0b8602110b3fbe6b69e22ff0d0b14e9f9e1117f08\": container with ID starting with 1505e11d7445a59e7632fed0b8602110b3fbe6b69e22ff0d0b14e9f9e1117f08 not found: ID does not exist" containerID="1505e11d7445a59e7632fed0b8602110b3fbe6b69e22ff0d0b14e9f9e1117f08"
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.103025 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1505e11d7445a59e7632fed0b8602110b3fbe6b69e22ff0d0b14e9f9e1117f08"} err="failed to get container status \"1505e11d7445a59e7632fed0b8602110b3fbe6b69e22ff0d0b14e9f9e1117f08\": rpc error: code = NotFound desc = could not find container \"1505e11d7445a59e7632fed0b8602110b3fbe6b69e22ff0d0b14e9f9e1117f08\": container with ID starting with 1505e11d7445a59e7632fed0b8602110b3fbe6b69e22ff0d0b14e9f9e1117f08 not found: ID does not exist"
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.103053 5008 scope.go:117] "RemoveContainer" containerID="6350d6110b2f3892c41628b7e56252a3d0e81ac6ebcd5b43163c946ff1015686"
Mar 18 18:37:02 crc kubenswrapper[5008]: E0318 18:37:02.103396 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6350d6110b2f3892c41628b7e56252a3d0e81ac6ebcd5b43163c946ff1015686\": container with ID starting with 6350d6110b2f3892c41628b7e56252a3d0e81ac6ebcd5b43163c946ff1015686 not found: ID does not exist" containerID="6350d6110b2f3892c41628b7e56252a3d0e81ac6ebcd5b43163c946ff1015686"
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.103433 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6350d6110b2f3892c41628b7e56252a3d0e81ac6ebcd5b43163c946ff1015686"} err="failed to get container status \"6350d6110b2f3892c41628b7e56252a3d0e81ac6ebcd5b43163c946ff1015686\": rpc error: code = NotFound desc = could not find container \"6350d6110b2f3892c41628b7e56252a3d0e81ac6ebcd5b43163c946ff1015686\": container with ID starting with 6350d6110b2f3892c41628b7e56252a3d0e81ac6ebcd5b43163c946ff1015686 not found: ID does not exist"
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.103455 5008 scope.go:117] "RemoveContainer" containerID="4e01595817ffedfc9eb2653e1e2c90b13a13e0ee711b37182f950c7d0f16def6"
Mar 18 18:37:02 crc kubenswrapper[5008]: E0318 18:37:02.103733 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e01595817ffedfc9eb2653e1e2c90b13a13e0ee711b37182f950c7d0f16def6\": container with ID starting with 4e01595817ffedfc9eb2653e1e2c90b13a13e0ee711b37182f950c7d0f16def6 not found: ID does not exist" containerID="4e01595817ffedfc9eb2653e1e2c90b13a13e0ee711b37182f950c7d0f16def6"
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.103761 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e01595817ffedfc9eb2653e1e2c90b13a13e0ee711b37182f950c7d0f16def6"} err="failed to get container status \"4e01595817ffedfc9eb2653e1e2c90b13a13e0ee711b37182f950c7d0f16def6\": rpc error: code = NotFound desc = could not find container \"4e01595817ffedfc9eb2653e1e2c90b13a13e0ee711b37182f950c7d0f16def6\": container with ID starting with 4e01595817ffedfc9eb2653e1e2c90b13a13e0ee711b37182f950c7d0f16def6 not found: ID does not exist"
Mar 18 18:37:02 crc kubenswrapper[5008]: I0318 18:37:02.206387 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e565e85-8023-4a2b-b989-4bec0108cbf5" path="/var/lib/kubelet/pods/9e565e85-8023-4a2b-b989-4bec0108cbf5/volumes"
Mar 18 18:37:04 crc kubenswrapper[5008]: I0318 18:37:04.050697 5008 generic.go:334] "Generic (PLEG): container finished" podID="e8a16325-dfb7-4283-8338-8df7160a978a" containerID="1cbbe1b0a94c5b442c44846c70e60ddb8c800d9c239980ed526a2395c46c8a23" exitCode=0
Mar 18 18:37:04 crc kubenswrapper[5008]: I0318 18:37:04.050908 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvt2w" event={"ID":"e8a16325-dfb7-4283-8338-8df7160a978a","Type":"ContainerDied","Data":"1cbbe1b0a94c5b442c44846c70e60ddb8c800d9c239980ed526a2395c46c8a23"}
Mar 18 18:37:05 crc kubenswrapper[5008]: I0318 18:37:05.061969 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvt2w" event={"ID":"e8a16325-dfb7-4283-8338-8df7160a978a","Type":"ContainerStarted","Data":"4d6e7394d88ab10b2d8f442b5f9f56b7a23240f4fc825210947c11216443e5d0"}
Mar 18 18:37:05 crc kubenswrapper[5008]: I0318 18:37:05.098391 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hvt2w" podStartSLOduration=2.65649442 podStartE2EDuration="5.098353422s" podCreationTimestamp="2026-03-18 18:37:00 +0000 UTC" firstStartedPulling="2026-03-18 18:37:02.030736269 +0000 UTC m=+2078.550209348" lastFinishedPulling="2026-03-18 18:37:04.472595241 +0000 UTC m=+2080.992068350" observedRunningTime="2026-03-18 18:37:05.094206922 +0000 UTC m=+2081.613680071" watchObservedRunningTime="2026-03-18 18:37:05.098353422 +0000 UTC m=+2081.617826541"
Mar 18 18:37:11 crc kubenswrapper[5008]: I0318 18:37:11.336399 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hvt2w"
Mar 18 18:37:11 crc kubenswrapper[5008]: I0318 18:37:11.337190 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hvt2w"
Mar 18 18:37:12 crc kubenswrapper[5008]: I0318 18:37:12.392089 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hvt2w" podUID="e8a16325-dfb7-4283-8338-8df7160a978a" containerName="registry-server" probeResult="failure" output=<
Mar 18 18:37:12 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s
Mar 18 18:37:12 crc kubenswrapper[5008]: >
Mar 18 18:37:21 crc kubenswrapper[5008]: I0318 18:37:21.396448 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hvt2w"
Mar 18 18:37:21 crc kubenswrapper[5008]: I0318 18:37:21.449993 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hvt2w"
Mar 18 18:37:21 crc kubenswrapper[5008]: I0318 18:37:21.634306 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvt2w"]
Mar 18 18:37:23 crc kubenswrapper[5008]: I0318 18:37:23.254217 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hvt2w" podUID="e8a16325-dfb7-4283-8338-8df7160a978a" containerName="registry-server" containerID="cri-o://4d6e7394d88ab10b2d8f442b5f9f56b7a23240f4fc825210947c11216443e5d0" gracePeriod=2
Mar 18 18:37:23 crc kubenswrapper[5008]: I0318 18:37:23.710243 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvt2w"
Mar 18 18:37:23 crc kubenswrapper[5008]: I0318 18:37:23.877763 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8a16325-dfb7-4283-8338-8df7160a978a-catalog-content\") pod \"e8a16325-dfb7-4283-8338-8df7160a978a\" (UID: \"e8a16325-dfb7-4283-8338-8df7160a978a\") "
Mar 18 18:37:23 crc kubenswrapper[5008]: I0318 18:37:23.877989 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4b77\" (UniqueName: \"kubernetes.io/projected/e8a16325-dfb7-4283-8338-8df7160a978a-kube-api-access-f4b77\") pod \"e8a16325-dfb7-4283-8338-8df7160a978a\" (UID: \"e8a16325-dfb7-4283-8338-8df7160a978a\") "
Mar 18 18:37:23 crc kubenswrapper[5008]: I0318 18:37:23.878212 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8a16325-dfb7-4283-8338-8df7160a978a-utilities\") pod \"e8a16325-dfb7-4283-8338-8df7160a978a\" (UID: \"e8a16325-dfb7-4283-8338-8df7160a978a\") "
Mar 18 18:37:23 crc kubenswrapper[5008]: I0318 18:37:23.879204 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a16325-dfb7-4283-8338-8df7160a978a-utilities" (OuterVolumeSpecName: "utilities") pod "e8a16325-dfb7-4283-8338-8df7160a978a" (UID: "e8a16325-dfb7-4283-8338-8df7160a978a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:37:23 crc kubenswrapper[5008]: I0318 18:37:23.892707 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a16325-dfb7-4283-8338-8df7160a978a-kube-api-access-f4b77" (OuterVolumeSpecName: "kube-api-access-f4b77") pod "e8a16325-dfb7-4283-8338-8df7160a978a" (UID: "e8a16325-dfb7-4283-8338-8df7160a978a"). InnerVolumeSpecName "kube-api-access-f4b77". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:37:23 crc kubenswrapper[5008]: I0318 18:37:23.979686 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8a16325-dfb7-4283-8338-8df7160a978a-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:37:23 crc kubenswrapper[5008]: I0318 18:37:23.979736 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4b77\" (UniqueName: \"kubernetes.io/projected/e8a16325-dfb7-4283-8338-8df7160a978a-kube-api-access-f4b77\") on node \"crc\" DevicePath \"\""
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.043184 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a16325-dfb7-4283-8338-8df7160a978a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8a16325-dfb7-4283-8338-8df7160a978a" (UID: "e8a16325-dfb7-4283-8338-8df7160a978a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.080612 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8a16325-dfb7-4283-8338-8df7160a978a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.264028 5008 generic.go:334] "Generic (PLEG): container finished" podID="e8a16325-dfb7-4283-8338-8df7160a978a" containerID="4d6e7394d88ab10b2d8f442b5f9f56b7a23240f4fc825210947c11216443e5d0" exitCode=0
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.264099 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvt2w" event={"ID":"e8a16325-dfb7-4283-8338-8df7160a978a","Type":"ContainerDied","Data":"4d6e7394d88ab10b2d8f442b5f9f56b7a23240f4fc825210947c11216443e5d0"}
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.264145 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvt2w" event={"ID":"e8a16325-dfb7-4283-8338-8df7160a978a","Type":"ContainerDied","Data":"29eb17dbad509013efcc2551885aeeac7685300bd3b8469b3fa1112dfc234dff"}
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.264172 5008 scope.go:117] "RemoveContainer" containerID="4d6e7394d88ab10b2d8f442b5f9f56b7a23240f4fc825210947c11216443e5d0"
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.264169 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvt2w"
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.295131 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvt2w"]
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.297428 5008 scope.go:117] "RemoveContainer" containerID="1cbbe1b0a94c5b442c44846c70e60ddb8c800d9c239980ed526a2395c46c8a23"
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.302456 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hvt2w"]
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.325262 5008 scope.go:117] "RemoveContainer" containerID="2c446bf8b306a088e77c8a5e9925d232cb148f4c6016c66c0323b8695ae941fa"
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.346845 5008 scope.go:117] "RemoveContainer" containerID="4d6e7394d88ab10b2d8f442b5f9f56b7a23240f4fc825210947c11216443e5d0"
Mar 18 18:37:24 crc kubenswrapper[5008]: E0318 18:37:24.347471 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d6e7394d88ab10b2d8f442b5f9f56b7a23240f4fc825210947c11216443e5d0\": container with ID starting with 4d6e7394d88ab10b2d8f442b5f9f56b7a23240f4fc825210947c11216443e5d0 not found: ID does not exist" containerID="4d6e7394d88ab10b2d8f442b5f9f56b7a23240f4fc825210947c11216443e5d0"
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.347515 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d6e7394d88ab10b2d8f442b5f9f56b7a23240f4fc825210947c11216443e5d0"} err="failed to get container status \"4d6e7394d88ab10b2d8f442b5f9f56b7a23240f4fc825210947c11216443e5d0\": rpc error: code = NotFound desc = could not find container \"4d6e7394d88ab10b2d8f442b5f9f56b7a23240f4fc825210947c11216443e5d0\": container with ID starting with 4d6e7394d88ab10b2d8f442b5f9f56b7a23240f4fc825210947c11216443e5d0 not found: ID does not exist"
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.347566 5008 scope.go:117] "RemoveContainer" containerID="1cbbe1b0a94c5b442c44846c70e60ddb8c800d9c239980ed526a2395c46c8a23"
Mar 18 18:37:24 crc kubenswrapper[5008]: E0318 18:37:24.347920 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cbbe1b0a94c5b442c44846c70e60ddb8c800d9c239980ed526a2395c46c8a23\": container with ID starting with 1cbbe1b0a94c5b442c44846c70e60ddb8c800d9c239980ed526a2395c46c8a23 not found: ID does not exist" containerID="1cbbe1b0a94c5b442c44846c70e60ddb8c800d9c239980ed526a2395c46c8a23"
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.347949 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cbbe1b0a94c5b442c44846c70e60ddb8c800d9c239980ed526a2395c46c8a23"} err="failed to get container status \"1cbbe1b0a94c5b442c44846c70e60ddb8c800d9c239980ed526a2395c46c8a23\": rpc error: code = NotFound desc = could not find container \"1cbbe1b0a94c5b442c44846c70e60ddb8c800d9c239980ed526a2395c46c8a23\": container with ID starting with 1cbbe1b0a94c5b442c44846c70e60ddb8c800d9c239980ed526a2395c46c8a23 not found: ID does not exist"
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.347962 5008 scope.go:117] "RemoveContainer" containerID="2c446bf8b306a088e77c8a5e9925d232cb148f4c6016c66c0323b8695ae941fa"
Mar 18 18:37:24 crc kubenswrapper[5008]: E0318 18:37:24.348181 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c446bf8b306a088e77c8a5e9925d232cb148f4c6016c66c0323b8695ae941fa\": container with ID starting with 2c446bf8b306a088e77c8a5e9925d232cb148f4c6016c66c0323b8695ae941fa not found: ID does not exist" containerID="2c446bf8b306a088e77c8a5e9925d232cb148f4c6016c66c0323b8695ae941fa"
Mar 18 18:37:24 crc kubenswrapper[5008]: I0318 18:37:24.348202 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c446bf8b306a088e77c8a5e9925d232cb148f4c6016c66c0323b8695ae941fa"} err="failed to get container status \"2c446bf8b306a088e77c8a5e9925d232cb148f4c6016c66c0323b8695ae941fa\": rpc error: code = NotFound desc = could not find container \"2c446bf8b306a088e77c8a5e9925d232cb148f4c6016c66c0323b8695ae941fa\": container with ID starting with 2c446bf8b306a088e77c8a5e9925d232cb148f4c6016c66c0323b8695ae941fa not found: ID does not exist"
Mar 18 18:37:26 crc kubenswrapper[5008]: I0318 18:37:26.206515 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a16325-dfb7-4283-8338-8df7160a978a" path="/var/lib/kubelet/pods/e8a16325-dfb7-4283-8338-8df7160a978a/volumes"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.145121 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564318-28qhs"]
Mar 18 18:38:00 crc kubenswrapper[5008]: E0318 18:38:00.146116 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e565e85-8023-4a2b-b989-4bec0108cbf5" containerName="extract-content"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.146148 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e565e85-8023-4a2b-b989-4bec0108cbf5" containerName="extract-content"
Mar 18 18:38:00 crc kubenswrapper[5008]: E0318 18:38:00.146162 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a16325-dfb7-4283-8338-8df7160a978a" containerName="extract-utilities"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.146170 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a16325-dfb7-4283-8338-8df7160a978a" containerName="extract-utilities"
Mar 18 18:38:00 crc kubenswrapper[5008]: E0318 18:38:00.146182 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a16325-dfb7-4283-8338-8df7160a978a" containerName="extract-content"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.146191 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a16325-dfb7-4283-8338-8df7160a978a" containerName="extract-content"
Mar 18 18:38:00 crc kubenswrapper[5008]: E0318 18:38:00.146235 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e565e85-8023-4a2b-b989-4bec0108cbf5" containerName="extract-utilities"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.146244 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e565e85-8023-4a2b-b989-4bec0108cbf5" containerName="extract-utilities"
Mar 18 18:38:00 crc kubenswrapper[5008]: E0318 18:38:00.146265 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a16325-dfb7-4283-8338-8df7160a978a" containerName="registry-server"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.146273 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a16325-dfb7-4283-8338-8df7160a978a" containerName="registry-server"
Mar 18 18:38:00 crc kubenswrapper[5008]: E0318 18:38:00.146305 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e565e85-8023-4a2b-b989-4bec0108cbf5" containerName="registry-server"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.146312 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e565e85-8023-4a2b-b989-4bec0108cbf5" containerName="registry-server"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.146508 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a16325-dfb7-4283-8338-8df7160a978a" containerName="registry-server"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.146540 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e565e85-8023-4a2b-b989-4bec0108cbf5" containerName="registry-server"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.147328 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564318-28qhs"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.149611 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.150227 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.156658 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564318-28qhs"]
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.158817 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.162395 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7b8l\" (UniqueName: \"kubernetes.io/projected/03243287-f0db-4520-80e2-3455e2b45929-kube-api-access-s7b8l\") pod \"auto-csr-approver-29564318-28qhs\" (UID: \"03243287-f0db-4520-80e2-3455e2b45929\") " pod="openshift-infra/auto-csr-approver-29564318-28qhs"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.263411 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7b8l\" (UniqueName: \"kubernetes.io/projected/03243287-f0db-4520-80e2-3455e2b45929-kube-api-access-s7b8l\") pod \"auto-csr-approver-29564318-28qhs\" (UID: \"03243287-f0db-4520-80e2-3455e2b45929\") " pod="openshift-infra/auto-csr-approver-29564318-28qhs"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.288961 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7b8l\" (UniqueName: \"kubernetes.io/projected/03243287-f0db-4520-80e2-3455e2b45929-kube-api-access-s7b8l\") pod \"auto-csr-approver-29564318-28qhs\" (UID: \"03243287-f0db-4520-80e2-3455e2b45929\") " pod="openshift-infra/auto-csr-approver-29564318-28qhs"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.471293 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564318-28qhs"
Mar 18 18:38:00 crc kubenswrapper[5008]: I0318 18:38:00.912508 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564318-28qhs"]
Mar 18 18:38:01 crc kubenswrapper[5008]: I0318 18:38:01.610065 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564318-28qhs" event={"ID":"03243287-f0db-4520-80e2-3455e2b45929","Type":"ContainerStarted","Data":"a52cb5aa76c674d4877d1b16218ddbe839d7d758d99a048b018ca65b2f4639aa"}
Mar 18 18:38:02 crc kubenswrapper[5008]: I0318 18:38:02.618705 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564318-28qhs" event={"ID":"03243287-f0db-4520-80e2-3455e2b45929","Type":"ContainerStarted","Data":"9d6fdc4b06404f194e085cebc4ca12629491e90f8452e50b0e4010f545bd61c3"}
Mar 18 18:38:03 crc kubenswrapper[5008]: I0318 18:38:03.629145 5008 generic.go:334] "Generic (PLEG): container finished" podID="03243287-f0db-4520-80e2-3455e2b45929" containerID="9d6fdc4b06404f194e085cebc4ca12629491e90f8452e50b0e4010f545bd61c3" exitCode=0
Mar 18 18:38:03 crc kubenswrapper[5008]: I0318 18:38:03.629202 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564318-28qhs" event={"ID":"03243287-f0db-4520-80e2-3455e2b45929","Type":"ContainerDied","Data":"9d6fdc4b06404f194e085cebc4ca12629491e90f8452e50b0e4010f545bd61c3"}
Mar 18 18:38:03 crc kubenswrapper[5008]: I0318 18:38:03.998189 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564318-28qhs"
Mar 18 18:38:04 crc kubenswrapper[5008]: I0318 18:38:04.156713 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7b8l\" (UniqueName: \"kubernetes.io/projected/03243287-f0db-4520-80e2-3455e2b45929-kube-api-access-s7b8l\") pod \"03243287-f0db-4520-80e2-3455e2b45929\" (UID: \"03243287-f0db-4520-80e2-3455e2b45929\") "
Mar 18 18:38:04 crc kubenswrapper[5008]: I0318 18:38:04.162341 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03243287-f0db-4520-80e2-3455e2b45929-kube-api-access-s7b8l" (OuterVolumeSpecName: "kube-api-access-s7b8l") pod "03243287-f0db-4520-80e2-3455e2b45929" (UID: "03243287-f0db-4520-80e2-3455e2b45929"). InnerVolumeSpecName "kube-api-access-s7b8l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:38:04 crc kubenswrapper[5008]: I0318 18:38:04.259186 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7b8l\" (UniqueName: \"kubernetes.io/projected/03243287-f0db-4520-80e2-3455e2b45929-kube-api-access-s7b8l\") on node \"crc\" DevicePath \"\""
Mar 18 18:38:04 crc kubenswrapper[5008]: I0318 18:38:04.643667 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564318-28qhs" event={"ID":"03243287-f0db-4520-80e2-3455e2b45929","Type":"ContainerDied","Data":"a52cb5aa76c674d4877d1b16218ddbe839d7d758d99a048b018ca65b2f4639aa"}
Mar 18 18:38:04 crc kubenswrapper[5008]: I0318 18:38:04.643713 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564318-28qhs"
Mar 18 18:38:04 crc kubenswrapper[5008]: I0318 18:38:04.643713 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a52cb5aa76c674d4877d1b16218ddbe839d7d758d99a048b018ca65b2f4639aa"
Mar 18 18:38:05 crc kubenswrapper[5008]: I0318 18:38:05.078204 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564312-gx7lw"]
Mar 18 18:38:05 crc kubenswrapper[5008]: I0318 18:38:05.086180 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564312-gx7lw"]
Mar 18 18:38:06 crc kubenswrapper[5008]: I0318 18:38:06.209999 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb7d7cb-b374-49f9-ac66-56c1229aa9f7" path="/var/lib/kubelet/pods/4cb7d7cb-b374-49f9-ac66-56c1229aa9f7/volumes"
Mar 18 18:38:40 crc kubenswrapper[5008]: I0318 18:38:40.718387 5008 scope.go:117] "RemoveContainer" containerID="588423083d7b0d4e4a689a21050c1765b6c46f1c21c24744afe7b3f6aad7ff66"
Mar 18 18:38:54 crc kubenswrapper[5008]: I0318 18:38:54.460254 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:38:54 crc kubenswrapper[5008]: I0318 18:38:54.460923 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:39:24 crc kubenswrapper[5008]: I0318 18:39:24.460806 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:39:24 crc kubenswrapper[5008]: I0318 18:39:24.461400 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:39:54 crc kubenswrapper[5008]: I0318 18:39:54.460641 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:39:54 crc kubenswrapper[5008]: I0318 18:39:54.461325 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:39:54 crc kubenswrapper[5008]: I0318 18:39:54.461398 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt"
Mar 18 18:39:54 crc kubenswrapper[5008]: I0318 18:39:54.462284 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 18:39:54 crc kubenswrapper[5008]: I0318 18:39:54.462380 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" gracePeriod=600
Mar 18 18:39:54 crc kubenswrapper[5008]: E0318 18:39:54.601314 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9"
Mar 18 18:39:54 crc kubenswrapper[5008]: I0318 18:39:54.793788 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" exitCode=0
Mar 18 18:39:54 crc kubenswrapper[5008]: I0318 18:39:54.793835 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c"}
Mar 18 18:39:54 crc kubenswrapper[5008]: I0318 18:39:54.793892 5008 scope.go:117] "RemoveContainer" containerID="6a517949b9ff0064573ecb8a2b93943d6bc661b4978c01e5e21d16dfc8f19892"
Mar 18 18:39:54 crc kubenswrapper[5008]: I0318 18:39:54.794524 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c"
Mar 18 18:39:54 crc kubenswrapper[5008]: E0318 18:39:54.794945 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9"
Mar 18 18:40:00 crc kubenswrapper[5008]: I0318 18:40:00.146697 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564320-c7lvf"]
Mar 18 18:40:00 crc kubenswrapper[5008]: E0318 18:40:00.147310 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03243287-f0db-4520-80e2-3455e2b45929" containerName="oc"
Mar 18 18:40:00 crc kubenswrapper[5008]: I0318 18:40:00.147328 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="03243287-f0db-4520-80e2-3455e2b45929" containerName="oc"
Mar 18 18:40:00 crc kubenswrapper[5008]: I0318 18:40:00.147502 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="03243287-f0db-4520-80e2-3455e2b45929" containerName="oc"
Mar 18 18:40:00 crc kubenswrapper[5008]: I0318 18:40:00.148018 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564320-c7lvf"
Mar 18 18:40:00 crc kubenswrapper[5008]: I0318 18:40:00.150430 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:40:00 crc kubenswrapper[5008]: I0318 18:40:00.152775 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:40:00 crc kubenswrapper[5008]: I0318 18:40:00.155484 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj"
Mar 18 18:40:00 crc kubenswrapper[5008]: I0318 18:40:00.157685 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564320-c7lvf"]
Mar 18 18:40:00 crc kubenswrapper[5008]: I0318 18:40:00.208010 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhxgs\" (UniqueName: \"kubernetes.io/projected/7bf092a2-e99a-4970-883b-7dd6b0882993-kube-api-access-fhxgs\") pod \"auto-csr-approver-29564320-c7lvf\" (UID: \"7bf092a2-e99a-4970-883b-7dd6b0882993\") " pod="openshift-infra/auto-csr-approver-29564320-c7lvf"
Mar 18 18:40:00 crc kubenswrapper[5008]: I0318 18:40:00.309650 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhxgs\" (UniqueName: \"kubernetes.io/projected/7bf092a2-e99a-4970-883b-7dd6b0882993-kube-api-access-fhxgs\") pod \"auto-csr-approver-29564320-c7lvf\" (UID: \"7bf092a2-e99a-4970-883b-7dd6b0882993\") " pod="openshift-infra/auto-csr-approver-29564320-c7lvf"
Mar 18 18:40:00 crc kubenswrapper[5008]: I0318 18:40:00.328731 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhxgs\" (UniqueName: \"kubernetes.io/projected/7bf092a2-e99a-4970-883b-7dd6b0882993-kube-api-access-fhxgs\") pod \"auto-csr-approver-29564320-c7lvf\" (UID: \"7bf092a2-e99a-4970-883b-7dd6b0882993\") "
pod="openshift-infra/auto-csr-approver-29564320-c7lvf" Mar 18 18:40:00 crc kubenswrapper[5008]: I0318 18:40:00.537952 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564320-c7lvf" Mar 18 18:40:01 crc kubenswrapper[5008]: I0318 18:40:01.136843 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564320-c7lvf"] Mar 18 18:40:01 crc kubenswrapper[5008]: W0318 18:40:01.164451 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bf092a2_e99a_4970_883b_7dd6b0882993.slice/crio-e28f12a6ce4f8cf6f9c314018370988a4c2c4f1f78bdcb56f2a24252808bf5c9 WatchSource:0}: Error finding container e28f12a6ce4f8cf6f9c314018370988a4c2c4f1f78bdcb56f2a24252808bf5c9: Status 404 returned error can't find the container with id e28f12a6ce4f8cf6f9c314018370988a4c2c4f1f78bdcb56f2a24252808bf5c9 Mar 18 18:40:01 crc kubenswrapper[5008]: I0318 18:40:01.167939 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:40:01 crc kubenswrapper[5008]: I0318 18:40:01.860922 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564320-c7lvf" event={"ID":"7bf092a2-e99a-4970-883b-7dd6b0882993","Type":"ContainerStarted","Data":"e28f12a6ce4f8cf6f9c314018370988a4c2c4f1f78bdcb56f2a24252808bf5c9"} Mar 18 18:40:02 crc kubenswrapper[5008]: I0318 18:40:02.879852 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564320-c7lvf" event={"ID":"7bf092a2-e99a-4970-883b-7dd6b0882993","Type":"ContainerStarted","Data":"52878deb5dfbd5274232081e5606ed9bf8bfa0547776301a35e0d6475204cc61"} Mar 18 18:40:02 crc kubenswrapper[5008]: I0318 18:40:02.904130 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564320-c7lvf" 
podStartSLOduration=1.762959819 podStartE2EDuration="2.904102948s" podCreationTimestamp="2026-03-18 18:40:00 +0000 UTC" firstStartedPulling="2026-03-18 18:40:01.167633308 +0000 UTC m=+2257.687106397" lastFinishedPulling="2026-03-18 18:40:02.308776407 +0000 UTC m=+2258.828249526" observedRunningTime="2026-03-18 18:40:02.89887934 +0000 UTC m=+2259.418352509" watchObservedRunningTime="2026-03-18 18:40:02.904102948 +0000 UTC m=+2259.423576047" Mar 18 18:40:03 crc kubenswrapper[5008]: I0318 18:40:03.894791 5008 generic.go:334] "Generic (PLEG): container finished" podID="7bf092a2-e99a-4970-883b-7dd6b0882993" containerID="52878deb5dfbd5274232081e5606ed9bf8bfa0547776301a35e0d6475204cc61" exitCode=0 Mar 18 18:40:03 crc kubenswrapper[5008]: I0318 18:40:03.894876 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564320-c7lvf" event={"ID":"7bf092a2-e99a-4970-883b-7dd6b0882993","Type":"ContainerDied","Data":"52878deb5dfbd5274232081e5606ed9bf8bfa0547776301a35e0d6475204cc61"} Mar 18 18:40:05 crc kubenswrapper[5008]: I0318 18:40:05.200073 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564320-c7lvf" Mar 18 18:40:05 crc kubenswrapper[5008]: I0318 18:40:05.299126 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhxgs\" (UniqueName: \"kubernetes.io/projected/7bf092a2-e99a-4970-883b-7dd6b0882993-kube-api-access-fhxgs\") pod \"7bf092a2-e99a-4970-883b-7dd6b0882993\" (UID: \"7bf092a2-e99a-4970-883b-7dd6b0882993\") " Mar 18 18:40:05 crc kubenswrapper[5008]: I0318 18:40:05.320771 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bf092a2-e99a-4970-883b-7dd6b0882993-kube-api-access-fhxgs" (OuterVolumeSpecName: "kube-api-access-fhxgs") pod "7bf092a2-e99a-4970-883b-7dd6b0882993" (UID: "7bf092a2-e99a-4970-883b-7dd6b0882993"). InnerVolumeSpecName "kube-api-access-fhxgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:40:05 crc kubenswrapper[5008]: I0318 18:40:05.400822 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhxgs\" (UniqueName: \"kubernetes.io/projected/7bf092a2-e99a-4970-883b-7dd6b0882993-kube-api-access-fhxgs\") on node \"crc\" DevicePath \"\"" Mar 18 18:40:06 crc kubenswrapper[5008]: I0318 18:40:05.909900 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564320-c7lvf" event={"ID":"7bf092a2-e99a-4970-883b-7dd6b0882993","Type":"ContainerDied","Data":"e28f12a6ce4f8cf6f9c314018370988a4c2c4f1f78bdcb56f2a24252808bf5c9"} Mar 18 18:40:06 crc kubenswrapper[5008]: I0318 18:40:05.909931 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e28f12a6ce4f8cf6f9c314018370988a4c2c4f1f78bdcb56f2a24252808bf5c9" Mar 18 18:40:06 crc kubenswrapper[5008]: I0318 18:40:05.909984 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564320-c7lvf" Mar 18 18:40:06 crc kubenswrapper[5008]: I0318 18:40:05.997984 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564314-zccgd"] Mar 18 18:40:06 crc kubenswrapper[5008]: I0318 18:40:06.005068 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564314-zccgd"] Mar 18 18:40:06 crc kubenswrapper[5008]: I0318 18:40:06.206368 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1465280-b484-42fc-9162-101f8fb8f308" path="/var/lib/kubelet/pods/d1465280-b484-42fc-9162-101f8fb8f308/volumes" Mar 18 18:40:08 crc kubenswrapper[5008]: I0318 18:40:08.198095 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:40:08 crc kubenswrapper[5008]: E0318 18:40:08.198300 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:40:20 crc kubenswrapper[5008]: I0318 18:40:20.199259 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:40:20 crc kubenswrapper[5008]: E0318 18:40:20.200272 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:40:32 crc kubenswrapper[5008]: I0318 18:40:32.199357 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:40:32 crc kubenswrapper[5008]: E0318 18:40:32.200732 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:40:40 crc kubenswrapper[5008]: I0318 18:40:40.845914 5008 scope.go:117] "RemoveContainer" containerID="eda1ccd22e49549bc71e9b1bd2a7f687ce2d7cb44a4eb2cba1b9aaa287224bd5" Mar 18 18:40:43 crc kubenswrapper[5008]: I0318 18:40:43.200474 5008 scope.go:117] "RemoveContainer" 
containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:40:43 crc kubenswrapper[5008]: E0318 18:40:43.201192 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:40:58 crc kubenswrapper[5008]: I0318 18:40:58.199959 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:40:58 crc kubenswrapper[5008]: E0318 18:40:58.200829 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:41:10 crc kubenswrapper[5008]: I0318 18:41:10.198715 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:41:10 crc kubenswrapper[5008]: E0318 18:41:10.199938 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.612712 5008 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k5j47"] Mar 18 18:41:12 crc kubenswrapper[5008]: E0318 18:41:12.613784 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf092a2-e99a-4970-883b-7dd6b0882993" containerName="oc" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.613818 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf092a2-e99a-4970-883b-7dd6b0882993" containerName="oc" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.614148 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf092a2-e99a-4970-883b-7dd6b0882993" containerName="oc" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.648049 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5j47"] Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.648193 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.778169 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a0476c-1b12-46b9-9be3-727532d4ab5f-utilities\") pod \"redhat-marketplace-k5j47\" (UID: \"73a0476c-1b12-46b9-9be3-727532d4ab5f\") " pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.778294 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a0476c-1b12-46b9-9be3-727532d4ab5f-catalog-content\") pod \"redhat-marketplace-k5j47\" (UID: \"73a0476c-1b12-46b9-9be3-727532d4ab5f\") " pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.778386 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-hlhzj\" (UniqueName: \"kubernetes.io/projected/73a0476c-1b12-46b9-9be3-727532d4ab5f-kube-api-access-hlhzj\") pod \"redhat-marketplace-k5j47\" (UID: \"73a0476c-1b12-46b9-9be3-727532d4ab5f\") " pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.879364 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlhzj\" (UniqueName: \"kubernetes.io/projected/73a0476c-1b12-46b9-9be3-727532d4ab5f-kube-api-access-hlhzj\") pod \"redhat-marketplace-k5j47\" (UID: \"73a0476c-1b12-46b9-9be3-727532d4ab5f\") " pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.879451 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a0476c-1b12-46b9-9be3-727532d4ab5f-utilities\") pod \"redhat-marketplace-k5j47\" (UID: \"73a0476c-1b12-46b9-9be3-727532d4ab5f\") " pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.879483 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a0476c-1b12-46b9-9be3-727532d4ab5f-catalog-content\") pod \"redhat-marketplace-k5j47\" (UID: \"73a0476c-1b12-46b9-9be3-727532d4ab5f\") " pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.879900 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a0476c-1b12-46b9-9be3-727532d4ab5f-catalog-content\") pod \"redhat-marketplace-k5j47\" (UID: \"73a0476c-1b12-46b9-9be3-727532d4ab5f\") " pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.880109 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/73a0476c-1b12-46b9-9be3-727532d4ab5f-utilities\") pod \"redhat-marketplace-k5j47\" (UID: \"73a0476c-1b12-46b9-9be3-727532d4ab5f\") " pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.904720 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlhzj\" (UniqueName: \"kubernetes.io/projected/73a0476c-1b12-46b9-9be3-727532d4ab5f-kube-api-access-hlhzj\") pod \"redhat-marketplace-k5j47\" (UID: \"73a0476c-1b12-46b9-9be3-727532d4ab5f\") " pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:12 crc kubenswrapper[5008]: I0318 18:41:12.984448 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:13 crc kubenswrapper[5008]: I0318 18:41:13.409187 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5j47"] Mar 18 18:41:13 crc kubenswrapper[5008]: I0318 18:41:13.521960 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5j47" event={"ID":"73a0476c-1b12-46b9-9be3-727532d4ab5f","Type":"ContainerStarted","Data":"43241a16d384a63d1c1f539223e64320d1a5cea435a0df7a7f9ddad09d063da8"} Mar 18 18:41:14 crc kubenswrapper[5008]: I0318 18:41:14.531843 5008 generic.go:334] "Generic (PLEG): container finished" podID="73a0476c-1b12-46b9-9be3-727532d4ab5f" containerID="b73bb57adb858d8e5b9ea41381309b77ad0b7d79ac0ae70fd4e705a0226eeca2" exitCode=0 Mar 18 18:41:14 crc kubenswrapper[5008]: I0318 18:41:14.531911 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5j47" event={"ID":"73a0476c-1b12-46b9-9be3-727532d4ab5f","Type":"ContainerDied","Data":"b73bb57adb858d8e5b9ea41381309b77ad0b7d79ac0ae70fd4e705a0226eeca2"} Mar 18 18:41:15 crc kubenswrapper[5008]: I0318 18:41:15.542187 5008 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-k5j47" event={"ID":"73a0476c-1b12-46b9-9be3-727532d4ab5f","Type":"ContainerStarted","Data":"4031487a577cfe157dabfa3817d94037cd5e105fea0da808556ef7d26320cb35"} Mar 18 18:41:16 crc kubenswrapper[5008]: I0318 18:41:16.554123 5008 generic.go:334] "Generic (PLEG): container finished" podID="73a0476c-1b12-46b9-9be3-727532d4ab5f" containerID="4031487a577cfe157dabfa3817d94037cd5e105fea0da808556ef7d26320cb35" exitCode=0 Mar 18 18:41:16 crc kubenswrapper[5008]: I0318 18:41:16.554194 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5j47" event={"ID":"73a0476c-1b12-46b9-9be3-727532d4ab5f","Type":"ContainerDied","Data":"4031487a577cfe157dabfa3817d94037cd5e105fea0da808556ef7d26320cb35"} Mar 18 18:41:17 crc kubenswrapper[5008]: I0318 18:41:17.565744 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5j47" event={"ID":"73a0476c-1b12-46b9-9be3-727532d4ab5f","Type":"ContainerStarted","Data":"a797e23dffb7fa5cda359ac08f0caf356daf54120aac88e0108332ef28d4d5bd"} Mar 18 18:41:17 crc kubenswrapper[5008]: I0318 18:41:17.591715 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k5j47" podStartSLOduration=3.070113378 podStartE2EDuration="5.59169235s" podCreationTimestamp="2026-03-18 18:41:12 +0000 UTC" firstStartedPulling="2026-03-18 18:41:14.53376062 +0000 UTC m=+2331.053233739" lastFinishedPulling="2026-03-18 18:41:17.055339592 +0000 UTC m=+2333.574812711" observedRunningTime="2026-03-18 18:41:17.586671437 +0000 UTC m=+2334.106144556" watchObservedRunningTime="2026-03-18 18:41:17.59169235 +0000 UTC m=+2334.111165469" Mar 18 18:41:22 crc kubenswrapper[5008]: I0318 18:41:22.985634 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:22 crc kubenswrapper[5008]: I0318 18:41:22.986842 5008 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:23 crc kubenswrapper[5008]: I0318 18:41:23.065889 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:23 crc kubenswrapper[5008]: I0318 18:41:23.198755 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:41:23 crc kubenswrapper[5008]: E0318 18:41:23.199018 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:41:23 crc kubenswrapper[5008]: I0318 18:41:23.691970 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:23 crc kubenswrapper[5008]: I0318 18:41:23.769524 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5j47"] Mar 18 18:41:25 crc kubenswrapper[5008]: I0318 18:41:25.635141 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k5j47" podUID="73a0476c-1b12-46b9-9be3-727532d4ab5f" containerName="registry-server" containerID="cri-o://a797e23dffb7fa5cda359ac08f0caf356daf54120aac88e0108332ef28d4d5bd" gracePeriod=2 Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.107004 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.288682 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a0476c-1b12-46b9-9be3-727532d4ab5f-utilities\") pod \"73a0476c-1b12-46b9-9be3-727532d4ab5f\" (UID: \"73a0476c-1b12-46b9-9be3-727532d4ab5f\") " Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.288841 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a0476c-1b12-46b9-9be3-727532d4ab5f-catalog-content\") pod \"73a0476c-1b12-46b9-9be3-727532d4ab5f\" (UID: \"73a0476c-1b12-46b9-9be3-727532d4ab5f\") " Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.289099 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlhzj\" (UniqueName: \"kubernetes.io/projected/73a0476c-1b12-46b9-9be3-727532d4ab5f-kube-api-access-hlhzj\") pod \"73a0476c-1b12-46b9-9be3-727532d4ab5f\" (UID: \"73a0476c-1b12-46b9-9be3-727532d4ab5f\") " Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.290336 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a0476c-1b12-46b9-9be3-727532d4ab5f-utilities" (OuterVolumeSpecName: "utilities") pod "73a0476c-1b12-46b9-9be3-727532d4ab5f" (UID: "73a0476c-1b12-46b9-9be3-727532d4ab5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.301778 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a0476c-1b12-46b9-9be3-727532d4ab5f-kube-api-access-hlhzj" (OuterVolumeSpecName: "kube-api-access-hlhzj") pod "73a0476c-1b12-46b9-9be3-727532d4ab5f" (UID: "73a0476c-1b12-46b9-9be3-727532d4ab5f"). InnerVolumeSpecName "kube-api-access-hlhzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.340952 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a0476c-1b12-46b9-9be3-727532d4ab5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73a0476c-1b12-46b9-9be3-727532d4ab5f" (UID: "73a0476c-1b12-46b9-9be3-727532d4ab5f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.390570 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a0476c-1b12-46b9-9be3-727532d4ab5f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.390616 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a0476c-1b12-46b9-9be3-727532d4ab5f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.390626 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlhzj\" (UniqueName: \"kubernetes.io/projected/73a0476c-1b12-46b9-9be3-727532d4ab5f-kube-api-access-hlhzj\") on node \"crc\" DevicePath \"\"" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.647154 5008 generic.go:334] "Generic (PLEG): container finished" podID="73a0476c-1b12-46b9-9be3-727532d4ab5f" containerID="a797e23dffb7fa5cda359ac08f0caf356daf54120aac88e0108332ef28d4d5bd" exitCode=0 Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.647215 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5j47" event={"ID":"73a0476c-1b12-46b9-9be3-727532d4ab5f","Type":"ContainerDied","Data":"a797e23dffb7fa5cda359ac08f0caf356daf54120aac88e0108332ef28d4d5bd"} Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.647253 5008 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-k5j47" event={"ID":"73a0476c-1b12-46b9-9be3-727532d4ab5f","Type":"ContainerDied","Data":"43241a16d384a63d1c1f539223e64320d1a5cea435a0df7a7f9ddad09d063da8"} Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.647282 5008 scope.go:117] "RemoveContainer" containerID="a797e23dffb7fa5cda359ac08f0caf356daf54120aac88e0108332ef28d4d5bd" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.647543 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5j47" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.671635 5008 scope.go:117] "RemoveContainer" containerID="4031487a577cfe157dabfa3817d94037cd5e105fea0da808556ef7d26320cb35" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.689474 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5j47"] Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.694362 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5j47"] Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.722824 5008 scope.go:117] "RemoveContainer" containerID="b73bb57adb858d8e5b9ea41381309b77ad0b7d79ac0ae70fd4e705a0226eeca2" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.740877 5008 scope.go:117] "RemoveContainer" containerID="a797e23dffb7fa5cda359ac08f0caf356daf54120aac88e0108332ef28d4d5bd" Mar 18 18:41:26 crc kubenswrapper[5008]: E0318 18:41:26.741281 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a797e23dffb7fa5cda359ac08f0caf356daf54120aac88e0108332ef28d4d5bd\": container with ID starting with a797e23dffb7fa5cda359ac08f0caf356daf54120aac88e0108332ef28d4d5bd not found: ID does not exist" containerID="a797e23dffb7fa5cda359ac08f0caf356daf54120aac88e0108332ef28d4d5bd" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.741321 5008 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a797e23dffb7fa5cda359ac08f0caf356daf54120aac88e0108332ef28d4d5bd"} err="failed to get container status \"a797e23dffb7fa5cda359ac08f0caf356daf54120aac88e0108332ef28d4d5bd\": rpc error: code = NotFound desc = could not find container \"a797e23dffb7fa5cda359ac08f0caf356daf54120aac88e0108332ef28d4d5bd\": container with ID starting with a797e23dffb7fa5cda359ac08f0caf356daf54120aac88e0108332ef28d4d5bd not found: ID does not exist" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.741348 5008 scope.go:117] "RemoveContainer" containerID="4031487a577cfe157dabfa3817d94037cd5e105fea0da808556ef7d26320cb35" Mar 18 18:41:26 crc kubenswrapper[5008]: E0318 18:41:26.741676 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4031487a577cfe157dabfa3817d94037cd5e105fea0da808556ef7d26320cb35\": container with ID starting with 4031487a577cfe157dabfa3817d94037cd5e105fea0da808556ef7d26320cb35 not found: ID does not exist" containerID="4031487a577cfe157dabfa3817d94037cd5e105fea0da808556ef7d26320cb35" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.741698 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4031487a577cfe157dabfa3817d94037cd5e105fea0da808556ef7d26320cb35"} err="failed to get container status \"4031487a577cfe157dabfa3817d94037cd5e105fea0da808556ef7d26320cb35\": rpc error: code = NotFound desc = could not find container \"4031487a577cfe157dabfa3817d94037cd5e105fea0da808556ef7d26320cb35\": container with ID starting with 4031487a577cfe157dabfa3817d94037cd5e105fea0da808556ef7d26320cb35 not found: ID does not exist" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.741714 5008 scope.go:117] "RemoveContainer" containerID="b73bb57adb858d8e5b9ea41381309b77ad0b7d79ac0ae70fd4e705a0226eeca2" Mar 18 18:41:26 crc kubenswrapper[5008]: E0318 
18:41:26.742049 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b73bb57adb858d8e5b9ea41381309b77ad0b7d79ac0ae70fd4e705a0226eeca2\": container with ID starting with b73bb57adb858d8e5b9ea41381309b77ad0b7d79ac0ae70fd4e705a0226eeca2 not found: ID does not exist" containerID="b73bb57adb858d8e5b9ea41381309b77ad0b7d79ac0ae70fd4e705a0226eeca2" Mar 18 18:41:26 crc kubenswrapper[5008]: I0318 18:41:26.742075 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73bb57adb858d8e5b9ea41381309b77ad0b7d79ac0ae70fd4e705a0226eeca2"} err="failed to get container status \"b73bb57adb858d8e5b9ea41381309b77ad0b7d79ac0ae70fd4e705a0226eeca2\": rpc error: code = NotFound desc = could not find container \"b73bb57adb858d8e5b9ea41381309b77ad0b7d79ac0ae70fd4e705a0226eeca2\": container with ID starting with b73bb57adb858d8e5b9ea41381309b77ad0b7d79ac0ae70fd4e705a0226eeca2 not found: ID does not exist" Mar 18 18:41:28 crc kubenswrapper[5008]: I0318 18:41:28.208941 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a0476c-1b12-46b9-9be3-727532d4ab5f" path="/var/lib/kubelet/pods/73a0476c-1b12-46b9-9be3-727532d4ab5f/volumes" Mar 18 18:41:36 crc kubenswrapper[5008]: I0318 18:41:36.198758 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:41:36 crc kubenswrapper[5008]: E0318 18:41:36.199657 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:41:50 crc kubenswrapper[5008]: I0318 18:41:50.198775 
5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:41:50 crc kubenswrapper[5008]: E0318 18:41:50.199688 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.150004 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564322-dlfpn"] Mar 18 18:42:00 crc kubenswrapper[5008]: E0318 18:42:00.151088 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a0476c-1b12-46b9-9be3-727532d4ab5f" containerName="registry-server" Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.151112 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a0476c-1b12-46b9-9be3-727532d4ab5f" containerName="registry-server" Mar 18 18:42:00 crc kubenswrapper[5008]: E0318 18:42:00.151147 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a0476c-1b12-46b9-9be3-727532d4ab5f" containerName="extract-utilities" Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.151159 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a0476c-1b12-46b9-9be3-727532d4ab5f" containerName="extract-utilities" Mar 18 18:42:00 crc kubenswrapper[5008]: E0318 18:42:00.151176 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a0476c-1b12-46b9-9be3-727532d4ab5f" containerName="extract-content" Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.151185 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a0476c-1b12-46b9-9be3-727532d4ab5f" containerName="extract-content" Mar 18 18:42:00 crc 
kubenswrapper[5008]: I0318 18:42:00.151436 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a0476c-1b12-46b9-9be3-727532d4ab5f" containerName="registry-server" Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.152161 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564322-dlfpn" Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.155449 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.155461 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.155523 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.161747 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564322-dlfpn"] Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.270124 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9k82\" (UniqueName: \"kubernetes.io/projected/6e29c6b1-9049-47ea-97a6-c1f00f42ebad-kube-api-access-c9k82\") pod \"auto-csr-approver-29564322-dlfpn\" (UID: \"6e29c6b1-9049-47ea-97a6-c1f00f42ebad\") " pod="openshift-infra/auto-csr-approver-29564322-dlfpn" Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.371838 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9k82\" (UniqueName: \"kubernetes.io/projected/6e29c6b1-9049-47ea-97a6-c1f00f42ebad-kube-api-access-c9k82\") pod \"auto-csr-approver-29564322-dlfpn\" (UID: \"6e29c6b1-9049-47ea-97a6-c1f00f42ebad\") " pod="openshift-infra/auto-csr-approver-29564322-dlfpn" Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.411521 
5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9k82\" (UniqueName: \"kubernetes.io/projected/6e29c6b1-9049-47ea-97a6-c1f00f42ebad-kube-api-access-c9k82\") pod \"auto-csr-approver-29564322-dlfpn\" (UID: \"6e29c6b1-9049-47ea-97a6-c1f00f42ebad\") " pod="openshift-infra/auto-csr-approver-29564322-dlfpn" Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.473668 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564322-dlfpn" Mar 18 18:42:00 crc kubenswrapper[5008]: I0318 18:42:00.986980 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564322-dlfpn"] Mar 18 18:42:01 crc kubenswrapper[5008]: I0318 18:42:01.995493 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564322-dlfpn" event={"ID":"6e29c6b1-9049-47ea-97a6-c1f00f42ebad","Type":"ContainerStarted","Data":"417655bfa8dbf025a86bb31eb8c14b416f7e26d61f1c409fd1245ac9d44c2ffe"} Mar 18 18:42:04 crc kubenswrapper[5008]: I0318 18:42:04.015323 5008 generic.go:334] "Generic (PLEG): container finished" podID="6e29c6b1-9049-47ea-97a6-c1f00f42ebad" containerID="f34960220ec0a7d60fe64f84aaa86216309edcf97ff42bebb28c2b5d506267dc" exitCode=0 Mar 18 18:42:04 crc kubenswrapper[5008]: I0318 18:42:04.015641 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564322-dlfpn" event={"ID":"6e29c6b1-9049-47ea-97a6-c1f00f42ebad","Type":"ContainerDied","Data":"f34960220ec0a7d60fe64f84aaa86216309edcf97ff42bebb28c2b5d506267dc"} Mar 18 18:42:05 crc kubenswrapper[5008]: I0318 18:42:05.198811 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:42:05 crc kubenswrapper[5008]: E0318 18:42:05.199034 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:42:05 crc kubenswrapper[5008]: I0318 18:42:05.324486 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564322-dlfpn" Mar 18 18:42:05 crc kubenswrapper[5008]: I0318 18:42:05.497843 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9k82\" (UniqueName: \"kubernetes.io/projected/6e29c6b1-9049-47ea-97a6-c1f00f42ebad-kube-api-access-c9k82\") pod \"6e29c6b1-9049-47ea-97a6-c1f00f42ebad\" (UID: \"6e29c6b1-9049-47ea-97a6-c1f00f42ebad\") " Mar 18 18:42:05 crc kubenswrapper[5008]: I0318 18:42:05.502535 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e29c6b1-9049-47ea-97a6-c1f00f42ebad-kube-api-access-c9k82" (OuterVolumeSpecName: "kube-api-access-c9k82") pod "6e29c6b1-9049-47ea-97a6-c1f00f42ebad" (UID: "6e29c6b1-9049-47ea-97a6-c1f00f42ebad"). InnerVolumeSpecName "kube-api-access-c9k82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:42:05 crc kubenswrapper[5008]: I0318 18:42:05.600908 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9k82\" (UniqueName: \"kubernetes.io/projected/6e29c6b1-9049-47ea-97a6-c1f00f42ebad-kube-api-access-c9k82\") on node \"crc\" DevicePath \"\"" Mar 18 18:42:06 crc kubenswrapper[5008]: I0318 18:42:06.032461 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564322-dlfpn" event={"ID":"6e29c6b1-9049-47ea-97a6-c1f00f42ebad","Type":"ContainerDied","Data":"417655bfa8dbf025a86bb31eb8c14b416f7e26d61f1c409fd1245ac9d44c2ffe"} Mar 18 18:42:06 crc kubenswrapper[5008]: I0318 18:42:06.032510 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="417655bfa8dbf025a86bb31eb8c14b416f7e26d61f1c409fd1245ac9d44c2ffe" Mar 18 18:42:06 crc kubenswrapper[5008]: I0318 18:42:06.032525 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564322-dlfpn" Mar 18 18:42:06 crc kubenswrapper[5008]: I0318 18:42:06.417272 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564316-z8lsc"] Mar 18 18:42:06 crc kubenswrapper[5008]: I0318 18:42:06.425610 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564316-z8lsc"] Mar 18 18:42:08 crc kubenswrapper[5008]: I0318 18:42:08.207590 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16726099-fb12-4782-ae36-7bd417eaac45" path="/var/lib/kubelet/pods/16726099-fb12-4782-ae36-7bd417eaac45/volumes" Mar 18 18:42:16 crc kubenswrapper[5008]: I0318 18:42:16.201620 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:42:16 crc kubenswrapper[5008]: E0318 18:42:16.204117 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:42:27 crc kubenswrapper[5008]: I0318 18:42:27.197822 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:42:27 crc kubenswrapper[5008]: E0318 18:42:27.198525 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.346344 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lsfdb"] Mar 18 18:42:39 crc kubenswrapper[5008]: E0318 18:42:39.348796 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e29c6b1-9049-47ea-97a6-c1f00f42ebad" containerName="oc" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.348836 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e29c6b1-9049-47ea-97a6-c1f00f42ebad" containerName="oc" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.349800 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e29c6b1-9049-47ea-97a6-c1f00f42ebad" containerName="oc" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.354381 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.382734 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lsfdb"] Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.443001 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-utilities\") pod \"certified-operators-lsfdb\" (UID: \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\") " pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.443101 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr5rq\" (UniqueName: \"kubernetes.io/projected/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-kube-api-access-vr5rq\") pod \"certified-operators-lsfdb\" (UID: \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\") " pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.443130 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-catalog-content\") pod \"certified-operators-lsfdb\" (UID: \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\") " pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.544227 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-utilities\") pod \"certified-operators-lsfdb\" (UID: \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\") " pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.544634 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vr5rq\" (UniqueName: \"kubernetes.io/projected/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-kube-api-access-vr5rq\") pod \"certified-operators-lsfdb\" (UID: \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\") " pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.544667 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-catalog-content\") pod \"certified-operators-lsfdb\" (UID: \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\") " pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.545154 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-utilities\") pod \"certified-operators-lsfdb\" (UID: \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\") " pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.545171 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-catalog-content\") pod \"certified-operators-lsfdb\" (UID: \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\") " pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.570061 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr5rq\" (UniqueName: \"kubernetes.io/projected/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-kube-api-access-vr5rq\") pod \"certified-operators-lsfdb\" (UID: \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\") " pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:39 crc kubenswrapper[5008]: I0318 18:42:39.702726 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:40 crc kubenswrapper[5008]: I0318 18:42:40.189919 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lsfdb"] Mar 18 18:42:40 crc kubenswrapper[5008]: I0318 18:42:40.351852 5008 generic.go:334] "Generic (PLEG): container finished" podID="6b3d84b7-43d2-47c1-8c14-8b734bc40f54" containerID="330ee6bdb710dd592aabc368c46f0498d649bf0a65ca410a6a462998743baea1" exitCode=0 Mar 18 18:42:40 crc kubenswrapper[5008]: I0318 18:42:40.351896 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsfdb" event={"ID":"6b3d84b7-43d2-47c1-8c14-8b734bc40f54","Type":"ContainerDied","Data":"330ee6bdb710dd592aabc368c46f0498d649bf0a65ca410a6a462998743baea1"} Mar 18 18:42:40 crc kubenswrapper[5008]: I0318 18:42:40.351964 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsfdb" event={"ID":"6b3d84b7-43d2-47c1-8c14-8b734bc40f54","Type":"ContainerStarted","Data":"621cac31a0163a88a49ea836cf7d8ad7c5b0099f8a954dad68eecc81bef1f6ab"} Mar 18 18:42:40 crc kubenswrapper[5008]: I0318 18:42:40.960838 5008 scope.go:117] "RemoveContainer" containerID="27d10a4f92b5966695a1e0d5444ab0aadf418f9d504e9af1fac6848f552933ac" Mar 18 18:42:41 crc kubenswrapper[5008]: I0318 18:42:41.204315 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:42:41 crc kubenswrapper[5008]: E0318 18:42:41.204966 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" 
podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:42:42 crc kubenswrapper[5008]: I0318 18:42:42.371225 5008 generic.go:334] "Generic (PLEG): container finished" podID="6b3d84b7-43d2-47c1-8c14-8b734bc40f54" containerID="06df70d40639c4f411be973cfaace65bd4e487241e0386e582f7eacf0d1e15b2" exitCode=0 Mar 18 18:42:42 crc kubenswrapper[5008]: I0318 18:42:42.371282 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsfdb" event={"ID":"6b3d84b7-43d2-47c1-8c14-8b734bc40f54","Type":"ContainerDied","Data":"06df70d40639c4f411be973cfaace65bd4e487241e0386e582f7eacf0d1e15b2"} Mar 18 18:42:43 crc kubenswrapper[5008]: I0318 18:42:43.381460 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsfdb" event={"ID":"6b3d84b7-43d2-47c1-8c14-8b734bc40f54","Type":"ContainerStarted","Data":"83ab3763bda6e586ad849da2fc735f7ae7ef5301cb6eb2afb8a43a6e7cf2d16f"} Mar 18 18:42:43 crc kubenswrapper[5008]: I0318 18:42:43.401443 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lsfdb" podStartSLOduration=1.8933443429999999 podStartE2EDuration="4.401424027s" podCreationTimestamp="2026-03-18 18:42:39 +0000 UTC" firstStartedPulling="2026-03-18 18:42:40.353048261 +0000 UTC m=+2416.872521340" lastFinishedPulling="2026-03-18 18:42:42.861127935 +0000 UTC m=+2419.380601024" observedRunningTime="2026-03-18 18:42:43.399212338 +0000 UTC m=+2419.918685437" watchObservedRunningTime="2026-03-18 18:42:43.401424027 +0000 UTC m=+2419.920897106" Mar 18 18:42:49 crc kubenswrapper[5008]: I0318 18:42:49.704039 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:49 crc kubenswrapper[5008]: I0318 18:42:49.704636 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:49 crc 
kubenswrapper[5008]: I0318 18:42:49.780032 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:50 crc kubenswrapper[5008]: I0318 18:42:50.501045 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:50 crc kubenswrapper[5008]: I0318 18:42:50.567245 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lsfdb"] Mar 18 18:42:52 crc kubenswrapper[5008]: I0318 18:42:52.453400 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lsfdb" podUID="6b3d84b7-43d2-47c1-8c14-8b734bc40f54" containerName="registry-server" containerID="cri-o://83ab3763bda6e586ad849da2fc735f7ae7ef5301cb6eb2afb8a43a6e7cf2d16f" gracePeriod=2 Mar 18 18:42:52 crc kubenswrapper[5008]: I0318 18:42:52.974038 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.086450 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-catalog-content\") pod \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\" (UID: \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\") " Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.086586 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr5rq\" (UniqueName: \"kubernetes.io/projected/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-kube-api-access-vr5rq\") pod \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\" (UID: \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\") " Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.086624 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-utilities\") pod \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\" (UID: \"6b3d84b7-43d2-47c1-8c14-8b734bc40f54\") " Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.087490 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-utilities" (OuterVolumeSpecName: "utilities") pod "6b3d84b7-43d2-47c1-8c14-8b734bc40f54" (UID: "6b3d84b7-43d2-47c1-8c14-8b734bc40f54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.095719 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-kube-api-access-vr5rq" (OuterVolumeSpecName: "kube-api-access-vr5rq") pod "6b3d84b7-43d2-47c1-8c14-8b734bc40f54" (UID: "6b3d84b7-43d2-47c1-8c14-8b734bc40f54"). InnerVolumeSpecName "kube-api-access-vr5rq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.149001 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b3d84b7-43d2-47c1-8c14-8b734bc40f54" (UID: "6b3d84b7-43d2-47c1-8c14-8b734bc40f54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.188384 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.188415 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.188429 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr5rq\" (UniqueName: \"kubernetes.io/projected/6b3d84b7-43d2-47c1-8c14-8b734bc40f54-kube-api-access-vr5rq\") on node \"crc\" DevicePath \"\"" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.462933 5008 generic.go:334] "Generic (PLEG): container finished" podID="6b3d84b7-43d2-47c1-8c14-8b734bc40f54" containerID="83ab3763bda6e586ad849da2fc735f7ae7ef5301cb6eb2afb8a43a6e7cf2d16f" exitCode=0 Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.462983 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lsfdb" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.463009 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsfdb" event={"ID":"6b3d84b7-43d2-47c1-8c14-8b734bc40f54","Type":"ContainerDied","Data":"83ab3763bda6e586ad849da2fc735f7ae7ef5301cb6eb2afb8a43a6e7cf2d16f"} Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.463061 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsfdb" event={"ID":"6b3d84b7-43d2-47c1-8c14-8b734bc40f54","Type":"ContainerDied","Data":"621cac31a0163a88a49ea836cf7d8ad7c5b0099f8a954dad68eecc81bef1f6ab"} Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.463090 5008 scope.go:117] "RemoveContainer" containerID="83ab3763bda6e586ad849da2fc735f7ae7ef5301cb6eb2afb8a43a6e7cf2d16f" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.490413 5008 scope.go:117] "RemoveContainer" containerID="06df70d40639c4f411be973cfaace65bd4e487241e0386e582f7eacf0d1e15b2" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.505044 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lsfdb"] Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.523666 5008 scope.go:117] "RemoveContainer" containerID="330ee6bdb710dd592aabc368c46f0498d649bf0a65ca410a6a462998743baea1" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.524939 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lsfdb"] Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.614143 5008 scope.go:117] "RemoveContainer" containerID="83ab3763bda6e586ad849da2fc735f7ae7ef5301cb6eb2afb8a43a6e7cf2d16f" Mar 18 18:42:53 crc kubenswrapper[5008]: E0318 18:42:53.614648 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"83ab3763bda6e586ad849da2fc735f7ae7ef5301cb6eb2afb8a43a6e7cf2d16f\": container with ID starting with 83ab3763bda6e586ad849da2fc735f7ae7ef5301cb6eb2afb8a43a6e7cf2d16f not found: ID does not exist" containerID="83ab3763bda6e586ad849da2fc735f7ae7ef5301cb6eb2afb8a43a6e7cf2d16f" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.614676 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ab3763bda6e586ad849da2fc735f7ae7ef5301cb6eb2afb8a43a6e7cf2d16f"} err="failed to get container status \"83ab3763bda6e586ad849da2fc735f7ae7ef5301cb6eb2afb8a43a6e7cf2d16f\": rpc error: code = NotFound desc = could not find container \"83ab3763bda6e586ad849da2fc735f7ae7ef5301cb6eb2afb8a43a6e7cf2d16f\": container with ID starting with 83ab3763bda6e586ad849da2fc735f7ae7ef5301cb6eb2afb8a43a6e7cf2d16f not found: ID does not exist" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.614695 5008 scope.go:117] "RemoveContainer" containerID="06df70d40639c4f411be973cfaace65bd4e487241e0386e582f7eacf0d1e15b2" Mar 18 18:42:53 crc kubenswrapper[5008]: E0318 18:42:53.614966 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06df70d40639c4f411be973cfaace65bd4e487241e0386e582f7eacf0d1e15b2\": container with ID starting with 06df70d40639c4f411be973cfaace65bd4e487241e0386e582f7eacf0d1e15b2 not found: ID does not exist" containerID="06df70d40639c4f411be973cfaace65bd4e487241e0386e582f7eacf0d1e15b2" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.614991 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06df70d40639c4f411be973cfaace65bd4e487241e0386e582f7eacf0d1e15b2"} err="failed to get container status \"06df70d40639c4f411be973cfaace65bd4e487241e0386e582f7eacf0d1e15b2\": rpc error: code = NotFound desc = could not find container \"06df70d40639c4f411be973cfaace65bd4e487241e0386e582f7eacf0d1e15b2\": container with ID 
starting with 06df70d40639c4f411be973cfaace65bd4e487241e0386e582f7eacf0d1e15b2 not found: ID does not exist" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.615005 5008 scope.go:117] "RemoveContainer" containerID="330ee6bdb710dd592aabc368c46f0498d649bf0a65ca410a6a462998743baea1" Mar 18 18:42:53 crc kubenswrapper[5008]: E0318 18:42:53.615510 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"330ee6bdb710dd592aabc368c46f0498d649bf0a65ca410a6a462998743baea1\": container with ID starting with 330ee6bdb710dd592aabc368c46f0498d649bf0a65ca410a6a462998743baea1 not found: ID does not exist" containerID="330ee6bdb710dd592aabc368c46f0498d649bf0a65ca410a6a462998743baea1" Mar 18 18:42:53 crc kubenswrapper[5008]: I0318 18:42:53.615530 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"330ee6bdb710dd592aabc368c46f0498d649bf0a65ca410a6a462998743baea1"} err="failed to get container status \"330ee6bdb710dd592aabc368c46f0498d649bf0a65ca410a6a462998743baea1\": rpc error: code = NotFound desc = could not find container \"330ee6bdb710dd592aabc368c46f0498d649bf0a65ca410a6a462998743baea1\": container with ID starting with 330ee6bdb710dd592aabc368c46f0498d649bf0a65ca410a6a462998743baea1 not found: ID does not exist" Mar 18 18:42:54 crc kubenswrapper[5008]: I0318 18:42:54.203353 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:42:54 crc kubenswrapper[5008]: E0318 18:42:54.203631 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" 
podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:42:54 crc kubenswrapper[5008]: I0318 18:42:54.207409 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b3d84b7-43d2-47c1-8c14-8b734bc40f54" path="/var/lib/kubelet/pods/6b3d84b7-43d2-47c1-8c14-8b734bc40f54/volumes" Mar 18 18:43:08 crc kubenswrapper[5008]: I0318 18:43:08.198700 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:43:08 crc kubenswrapper[5008]: E0318 18:43:08.199426 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:43:23 crc kubenswrapper[5008]: I0318 18:43:23.199149 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:43:23 crc kubenswrapper[5008]: E0318 18:43:23.200342 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:43:34 crc kubenswrapper[5008]: I0318 18:43:34.208135 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:43:34 crc kubenswrapper[5008]: E0318 18:43:34.209354 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:43:48 crc kubenswrapper[5008]: I0318 18:43:48.199031 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:43:48 crc kubenswrapper[5008]: E0318 18:43:48.200224 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.161049 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564324-9hf9w"] Mar 18 18:44:00 crc kubenswrapper[5008]: E0318 18:44:00.162183 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3d84b7-43d2-47c1-8c14-8b734bc40f54" containerName="extract-content" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.162204 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3d84b7-43d2-47c1-8c14-8b734bc40f54" containerName="extract-content" Mar 18 18:44:00 crc kubenswrapper[5008]: E0318 18:44:00.162225 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3d84b7-43d2-47c1-8c14-8b734bc40f54" containerName="registry-server" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.162236 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3d84b7-43d2-47c1-8c14-8b734bc40f54" containerName="registry-server" Mar 18 18:44:00 crc 
kubenswrapper[5008]: E0318 18:44:00.162260 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3d84b7-43d2-47c1-8c14-8b734bc40f54" containerName="extract-utilities" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.162274 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3d84b7-43d2-47c1-8c14-8b734bc40f54" containerName="extract-utilities" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.162511 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3d84b7-43d2-47c1-8c14-8b734bc40f54" containerName="registry-server" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.163216 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564324-9hf9w" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.166620 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.172968 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.173931 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564324-9hf9w"] Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.173064 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.200639 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:44:00 crc kubenswrapper[5008]: E0318 18:44:00.201003 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.296012 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ds2n\" (UniqueName: \"kubernetes.io/projected/6b7503e3-be08-46e5-9651-5acfe48f9cb1-kube-api-access-7ds2n\") pod \"auto-csr-approver-29564324-9hf9w\" (UID: \"6b7503e3-be08-46e5-9651-5acfe48f9cb1\") " pod="openshift-infra/auto-csr-approver-29564324-9hf9w" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.400694 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ds2n\" (UniqueName: \"kubernetes.io/projected/6b7503e3-be08-46e5-9651-5acfe48f9cb1-kube-api-access-7ds2n\") pod \"auto-csr-approver-29564324-9hf9w\" (UID: \"6b7503e3-be08-46e5-9651-5acfe48f9cb1\") " pod="openshift-infra/auto-csr-approver-29564324-9hf9w" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.436548 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ds2n\" (UniqueName: \"kubernetes.io/projected/6b7503e3-be08-46e5-9651-5acfe48f9cb1-kube-api-access-7ds2n\") pod \"auto-csr-approver-29564324-9hf9w\" (UID: \"6b7503e3-be08-46e5-9651-5acfe48f9cb1\") " pod="openshift-infra/auto-csr-approver-29564324-9hf9w" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.497058 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564324-9hf9w" Mar 18 18:44:00 crc kubenswrapper[5008]: I0318 18:44:00.748717 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564324-9hf9w"] Mar 18 18:44:01 crc kubenswrapper[5008]: I0318 18:44:01.071033 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564324-9hf9w" event={"ID":"6b7503e3-be08-46e5-9651-5acfe48f9cb1","Type":"ContainerStarted","Data":"11b3cb8437b54352ac0e377edb0e243fec018b93cfc1b68d2d73458c7661deb3"} Mar 18 18:44:03 crc kubenswrapper[5008]: I0318 18:44:03.087547 5008 generic.go:334] "Generic (PLEG): container finished" podID="6b7503e3-be08-46e5-9651-5acfe48f9cb1" containerID="66f7e0158677c710b9ba8f2a71888418d2926ea46dff65484195eefb67dc7626" exitCode=0 Mar 18 18:44:03 crc kubenswrapper[5008]: I0318 18:44:03.087779 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564324-9hf9w" event={"ID":"6b7503e3-be08-46e5-9651-5acfe48f9cb1","Type":"ContainerDied","Data":"66f7e0158677c710b9ba8f2a71888418d2926ea46dff65484195eefb67dc7626"} Mar 18 18:44:04 crc kubenswrapper[5008]: I0318 18:44:04.433108 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564324-9hf9w" Mar 18 18:44:04 crc kubenswrapper[5008]: I0318 18:44:04.570630 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ds2n\" (UniqueName: \"kubernetes.io/projected/6b7503e3-be08-46e5-9651-5acfe48f9cb1-kube-api-access-7ds2n\") pod \"6b7503e3-be08-46e5-9651-5acfe48f9cb1\" (UID: \"6b7503e3-be08-46e5-9651-5acfe48f9cb1\") " Mar 18 18:44:04 crc kubenswrapper[5008]: I0318 18:44:04.577782 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b7503e3-be08-46e5-9651-5acfe48f9cb1-kube-api-access-7ds2n" (OuterVolumeSpecName: "kube-api-access-7ds2n") pod "6b7503e3-be08-46e5-9651-5acfe48f9cb1" (UID: "6b7503e3-be08-46e5-9651-5acfe48f9cb1"). InnerVolumeSpecName "kube-api-access-7ds2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:44:04 crc kubenswrapper[5008]: I0318 18:44:04.673591 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ds2n\" (UniqueName: \"kubernetes.io/projected/6b7503e3-be08-46e5-9651-5acfe48f9cb1-kube-api-access-7ds2n\") on node \"crc\" DevicePath \"\"" Mar 18 18:44:05 crc kubenswrapper[5008]: I0318 18:44:05.104716 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564324-9hf9w" event={"ID":"6b7503e3-be08-46e5-9651-5acfe48f9cb1","Type":"ContainerDied","Data":"11b3cb8437b54352ac0e377edb0e243fec018b93cfc1b68d2d73458c7661deb3"} Mar 18 18:44:05 crc kubenswrapper[5008]: I0318 18:44:05.104772 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11b3cb8437b54352ac0e377edb0e243fec018b93cfc1b68d2d73458c7661deb3" Mar 18 18:44:05 crc kubenswrapper[5008]: I0318 18:44:05.104843 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564324-9hf9w" Mar 18 18:44:05 crc kubenswrapper[5008]: I0318 18:44:05.522083 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564318-28qhs"] Mar 18 18:44:05 crc kubenswrapper[5008]: I0318 18:44:05.526838 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564318-28qhs"] Mar 18 18:44:06 crc kubenswrapper[5008]: I0318 18:44:06.210152 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03243287-f0db-4520-80e2-3455e2b45929" path="/var/lib/kubelet/pods/03243287-f0db-4520-80e2-3455e2b45929/volumes" Mar 18 18:44:14 crc kubenswrapper[5008]: I0318 18:44:14.206955 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:44:14 crc kubenswrapper[5008]: E0318 18:44:14.209998 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:44:26 crc kubenswrapper[5008]: I0318 18:44:26.198780 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:44:26 crc kubenswrapper[5008]: E0318 18:44:26.199797 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" 
podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:44:41 crc kubenswrapper[5008]: I0318 18:44:41.074732 5008 scope.go:117] "RemoveContainer" containerID="9d6fdc4b06404f194e085cebc4ca12629491e90f8452e50b0e4010f545bd61c3" Mar 18 18:44:41 crc kubenswrapper[5008]: I0318 18:44:41.198878 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:44:41 crc kubenswrapper[5008]: E0318 18:44:41.199165 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:44:52 crc kubenswrapper[5008]: I0318 18:44:52.198671 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:44:52 crc kubenswrapper[5008]: E0318 18:44:52.199755 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.166881 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx"] Mar 18 18:45:00 crc kubenswrapper[5008]: E0318 18:45:00.168510 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7503e3-be08-46e5-9651-5acfe48f9cb1" containerName="oc" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 
18:45:00.168545 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7503e3-be08-46e5-9651-5acfe48f9cb1" containerName="oc" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.168989 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b7503e3-be08-46e5-9651-5acfe48f9cb1" containerName="oc" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.170092 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.173079 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx"] Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.173426 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.173530 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.193628 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb9175af-b974-4766-9c81-ffe6765a2099-secret-volume\") pod \"collect-profiles-29564325-zjpkx\" (UID: \"eb9175af-b974-4766-9c81-ffe6765a2099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.193671 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7km6\" (UniqueName: \"kubernetes.io/projected/eb9175af-b974-4766-9c81-ffe6765a2099-kube-api-access-f7km6\") pod \"collect-profiles-29564325-zjpkx\" (UID: \"eb9175af-b974-4766-9c81-ffe6765a2099\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.193717 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb9175af-b974-4766-9c81-ffe6765a2099-config-volume\") pod \"collect-profiles-29564325-zjpkx\" (UID: \"eb9175af-b974-4766-9c81-ffe6765a2099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.295375 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7km6\" (UniqueName: \"kubernetes.io/projected/eb9175af-b974-4766-9c81-ffe6765a2099-kube-api-access-f7km6\") pod \"collect-profiles-29564325-zjpkx\" (UID: \"eb9175af-b974-4766-9c81-ffe6765a2099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.295858 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb9175af-b974-4766-9c81-ffe6765a2099-secret-volume\") pod \"collect-profiles-29564325-zjpkx\" (UID: \"eb9175af-b974-4766-9c81-ffe6765a2099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.296004 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb9175af-b974-4766-9c81-ffe6765a2099-config-volume\") pod \"collect-profiles-29564325-zjpkx\" (UID: \"eb9175af-b974-4766-9c81-ffe6765a2099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.296711 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/eb9175af-b974-4766-9c81-ffe6765a2099-config-volume\") pod \"collect-profiles-29564325-zjpkx\" (UID: \"eb9175af-b974-4766-9c81-ffe6765a2099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.309469 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb9175af-b974-4766-9c81-ffe6765a2099-secret-volume\") pod \"collect-profiles-29564325-zjpkx\" (UID: \"eb9175af-b974-4766-9c81-ffe6765a2099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.311474 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7km6\" (UniqueName: \"kubernetes.io/projected/eb9175af-b974-4766-9c81-ffe6765a2099-kube-api-access-f7km6\") pod \"collect-profiles-29564325-zjpkx\" (UID: \"eb9175af-b974-4766-9c81-ffe6765a2099\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.502320 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" Mar 18 18:45:00 crc kubenswrapper[5008]: I0318 18:45:00.897315 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx"] Mar 18 18:45:01 crc kubenswrapper[5008]: I0318 18:45:01.639228 5008 generic.go:334] "Generic (PLEG): container finished" podID="eb9175af-b974-4766-9c81-ffe6765a2099" containerID="e2d0ef204a166e30edc9dd4e618f6321828bb1606f9e2a5163376d95def2ff6c" exitCode=0 Mar 18 18:45:01 crc kubenswrapper[5008]: I0318 18:45:01.639403 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" event={"ID":"eb9175af-b974-4766-9c81-ffe6765a2099","Type":"ContainerDied","Data":"e2d0ef204a166e30edc9dd4e618f6321828bb1606f9e2a5163376d95def2ff6c"} Mar 18 18:45:01 crc kubenswrapper[5008]: I0318 18:45:01.639510 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" event={"ID":"eb9175af-b974-4766-9c81-ffe6765a2099","Type":"ContainerStarted","Data":"cde26c2b5a4746ca77aa9ac133a5616b93b096846b8bd6a7344515169a4decb6"} Mar 18 18:45:03 crc kubenswrapper[5008]: I0318 18:45:03.106083 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" Mar 18 18:45:03 crc kubenswrapper[5008]: I0318 18:45:03.144781 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7km6\" (UniqueName: \"kubernetes.io/projected/eb9175af-b974-4766-9c81-ffe6765a2099-kube-api-access-f7km6\") pod \"eb9175af-b974-4766-9c81-ffe6765a2099\" (UID: \"eb9175af-b974-4766-9c81-ffe6765a2099\") " Mar 18 18:45:03 crc kubenswrapper[5008]: I0318 18:45:03.144990 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb9175af-b974-4766-9c81-ffe6765a2099-secret-volume\") pod \"eb9175af-b974-4766-9c81-ffe6765a2099\" (UID: \"eb9175af-b974-4766-9c81-ffe6765a2099\") " Mar 18 18:45:03 crc kubenswrapper[5008]: I0318 18:45:03.145042 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb9175af-b974-4766-9c81-ffe6765a2099-config-volume\") pod \"eb9175af-b974-4766-9c81-ffe6765a2099\" (UID: \"eb9175af-b974-4766-9c81-ffe6765a2099\") " Mar 18 18:45:03 crc kubenswrapper[5008]: I0318 18:45:03.145833 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb9175af-b974-4766-9c81-ffe6765a2099-config-volume" (OuterVolumeSpecName: "config-volume") pod "eb9175af-b974-4766-9c81-ffe6765a2099" (UID: "eb9175af-b974-4766-9c81-ffe6765a2099"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 18:45:03 crc kubenswrapper[5008]: I0318 18:45:03.151942 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb9175af-b974-4766-9c81-ffe6765a2099-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eb9175af-b974-4766-9c81-ffe6765a2099" (UID: "eb9175af-b974-4766-9c81-ffe6765a2099"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 18:45:03 crc kubenswrapper[5008]: I0318 18:45:03.152746 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb9175af-b974-4766-9c81-ffe6765a2099-kube-api-access-f7km6" (OuterVolumeSpecName: "kube-api-access-f7km6") pod "eb9175af-b974-4766-9c81-ffe6765a2099" (UID: "eb9175af-b974-4766-9c81-ffe6765a2099"). InnerVolumeSpecName "kube-api-access-f7km6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:45:03 crc kubenswrapper[5008]: I0318 18:45:03.247478 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7km6\" (UniqueName: \"kubernetes.io/projected/eb9175af-b974-4766-9c81-ffe6765a2099-kube-api-access-f7km6\") on node \"crc\" DevicePath \"\"" Mar 18 18:45:03 crc kubenswrapper[5008]: I0318 18:45:03.247527 5008 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb9175af-b974-4766-9c81-ffe6765a2099-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:45:03 crc kubenswrapper[5008]: I0318 18:45:03.247548 5008 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb9175af-b974-4766-9c81-ffe6765a2099-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 18:45:03 crc kubenswrapper[5008]: I0318 18:45:03.663138 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" event={"ID":"eb9175af-b974-4766-9c81-ffe6765a2099","Type":"ContainerDied","Data":"cde26c2b5a4746ca77aa9ac133a5616b93b096846b8bd6a7344515169a4decb6"} Mar 18 18:45:03 crc kubenswrapper[5008]: I0318 18:45:03.663191 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cde26c2b5a4746ca77aa9ac133a5616b93b096846b8bd6a7344515169a4decb6" Mar 18 18:45:03 crc kubenswrapper[5008]: I0318 18:45:03.663222 5008 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx" Mar 18 18:45:04 crc kubenswrapper[5008]: I0318 18:45:04.211357 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl"] Mar 18 18:45:04 crc kubenswrapper[5008]: I0318 18:45:04.217988 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564280-d46pl"] Mar 18 18:45:06 crc kubenswrapper[5008]: I0318 18:45:06.209819 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="860a9876-b8f6-4125-bd1c-51518eb10283" path="/var/lib/kubelet/pods/860a9876-b8f6-4125-bd1c-51518eb10283/volumes" Mar 18 18:45:07 crc kubenswrapper[5008]: I0318 18:45:07.198805 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:45:07 crc kubenswrapper[5008]: I0318 18:45:07.700740 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"c5c837c0a7c89f4a27127e124d9788fb50066a8b2a7b9525f7a5cb470479e844"} Mar 18 18:45:41 crc kubenswrapper[5008]: I0318 18:45:41.145770 5008 scope.go:117] "RemoveContainer" containerID="a6d520fda5cf8a6f5d8092a6215019730978014336857c00e8726c3623486cd0" Mar 18 18:46:00 crc kubenswrapper[5008]: I0318 18:46:00.166819 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564326-l8brh"] Mar 18 18:46:00 crc kubenswrapper[5008]: E0318 18:46:00.168238 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9175af-b974-4766-9c81-ffe6765a2099" containerName="collect-profiles" Mar 18 18:46:00 crc kubenswrapper[5008]: I0318 18:46:00.168273 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9175af-b974-4766-9c81-ffe6765a2099" 
containerName="collect-profiles" Mar 18 18:46:00 crc kubenswrapper[5008]: I0318 18:46:00.168704 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb9175af-b974-4766-9c81-ffe6765a2099" containerName="collect-profiles" Mar 18 18:46:00 crc kubenswrapper[5008]: I0318 18:46:00.169784 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564326-l8brh" Mar 18 18:46:00 crc kubenswrapper[5008]: I0318 18:46:00.173318 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:46:00 crc kubenswrapper[5008]: I0318 18:46:00.173961 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:46:00 crc kubenswrapper[5008]: I0318 18:46:00.180790 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:46:00 crc kubenswrapper[5008]: I0318 18:46:00.181495 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564326-l8brh"] Mar 18 18:46:00 crc kubenswrapper[5008]: I0318 18:46:00.371583 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsss7\" (UniqueName: \"kubernetes.io/projected/675cef98-7f9a-4b71-88cb-1be0c5a34bbe-kube-api-access-tsss7\") pod \"auto-csr-approver-29564326-l8brh\" (UID: \"675cef98-7f9a-4b71-88cb-1be0c5a34bbe\") " pod="openshift-infra/auto-csr-approver-29564326-l8brh" Mar 18 18:46:00 crc kubenswrapper[5008]: I0318 18:46:00.473811 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsss7\" (UniqueName: \"kubernetes.io/projected/675cef98-7f9a-4b71-88cb-1be0c5a34bbe-kube-api-access-tsss7\") pod \"auto-csr-approver-29564326-l8brh\" (UID: \"675cef98-7f9a-4b71-88cb-1be0c5a34bbe\") " pod="openshift-infra/auto-csr-approver-29564326-l8brh" Mar 18 
18:46:00 crc kubenswrapper[5008]: I0318 18:46:00.499348 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsss7\" (UniqueName: \"kubernetes.io/projected/675cef98-7f9a-4b71-88cb-1be0c5a34bbe-kube-api-access-tsss7\") pod \"auto-csr-approver-29564326-l8brh\" (UID: \"675cef98-7f9a-4b71-88cb-1be0c5a34bbe\") " pod="openshift-infra/auto-csr-approver-29564326-l8brh" Mar 18 18:46:00 crc kubenswrapper[5008]: I0318 18:46:00.503010 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564326-l8brh" Mar 18 18:46:00 crc kubenswrapper[5008]: I0318 18:46:00.998054 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564326-l8brh"] Mar 18 18:46:01 crc kubenswrapper[5008]: I0318 18:46:01.004889 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:46:01 crc kubenswrapper[5008]: I0318 18:46:01.194857 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564326-l8brh" event={"ID":"675cef98-7f9a-4b71-88cb-1be0c5a34bbe","Type":"ContainerStarted","Data":"34128ccf66c0be45219ecd8964fec68a725679470fa9893f7935495b27dc3f56"} Mar 18 18:46:03 crc kubenswrapper[5008]: I0318 18:46:03.215435 5008 generic.go:334] "Generic (PLEG): container finished" podID="675cef98-7f9a-4b71-88cb-1be0c5a34bbe" containerID="a3872188e8c3fba23da0dc1940a81362b146d5d214923a2229ea5a04bb36211e" exitCode=0 Mar 18 18:46:03 crc kubenswrapper[5008]: I0318 18:46:03.216474 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564326-l8brh" event={"ID":"675cef98-7f9a-4b71-88cb-1be0c5a34bbe","Type":"ContainerDied","Data":"a3872188e8c3fba23da0dc1940a81362b146d5d214923a2229ea5a04bb36211e"} Mar 18 18:46:04 crc kubenswrapper[5008]: I0318 18:46:04.588697 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564326-l8brh" Mar 18 18:46:04 crc kubenswrapper[5008]: I0318 18:46:04.738372 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsss7\" (UniqueName: \"kubernetes.io/projected/675cef98-7f9a-4b71-88cb-1be0c5a34bbe-kube-api-access-tsss7\") pod \"675cef98-7f9a-4b71-88cb-1be0c5a34bbe\" (UID: \"675cef98-7f9a-4b71-88cb-1be0c5a34bbe\") " Mar 18 18:46:04 crc kubenswrapper[5008]: I0318 18:46:04.749864 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/675cef98-7f9a-4b71-88cb-1be0c5a34bbe-kube-api-access-tsss7" (OuterVolumeSpecName: "kube-api-access-tsss7") pod "675cef98-7f9a-4b71-88cb-1be0c5a34bbe" (UID: "675cef98-7f9a-4b71-88cb-1be0c5a34bbe"). InnerVolumeSpecName "kube-api-access-tsss7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:46:04 crc kubenswrapper[5008]: I0318 18:46:04.840794 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsss7\" (UniqueName: \"kubernetes.io/projected/675cef98-7f9a-4b71-88cb-1be0c5a34bbe-kube-api-access-tsss7\") on node \"crc\" DevicePath \"\"" Mar 18 18:46:05 crc kubenswrapper[5008]: I0318 18:46:05.236674 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564326-l8brh" event={"ID":"675cef98-7f9a-4b71-88cb-1be0c5a34bbe","Type":"ContainerDied","Data":"34128ccf66c0be45219ecd8964fec68a725679470fa9893f7935495b27dc3f56"} Mar 18 18:46:05 crc kubenswrapper[5008]: I0318 18:46:05.236770 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34128ccf66c0be45219ecd8964fec68a725679470fa9893f7935495b27dc3f56" Mar 18 18:46:05 crc kubenswrapper[5008]: I0318 18:46:05.236853 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564326-l8brh"
Mar 18 18:46:05 crc kubenswrapper[5008]: I0318 18:46:05.776607 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564320-c7lvf"]
Mar 18 18:46:05 crc kubenswrapper[5008]: I0318 18:46:05.786272 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564320-c7lvf"]
Mar 18 18:46:06 crc kubenswrapper[5008]: I0318 18:46:06.214193 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bf092a2-e99a-4970-883b-7dd6b0882993" path="/var/lib/kubelet/pods/7bf092a2-e99a-4970-883b-7dd6b0882993/volumes"
Mar 18 18:46:41 crc kubenswrapper[5008]: I0318 18:46:41.220690 5008 scope.go:117] "RemoveContainer" containerID="52878deb5dfbd5274232081e5606ed9bf8bfa0547776301a35e0d6475204cc61"
Mar 18 18:47:24 crc kubenswrapper[5008]: I0318 18:47:24.460788 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:47:24 crc kubenswrapper[5008]: I0318 18:47:24.461425 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.456414 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7nlzx"]
Mar 18 18:47:32 crc kubenswrapper[5008]: E0318 18:47:32.457265 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675cef98-7f9a-4b71-88cb-1be0c5a34bbe" containerName="oc"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.457277 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="675cef98-7f9a-4b71-88cb-1be0c5a34bbe" containerName="oc"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.457401 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="675cef98-7f9a-4b71-88cb-1be0c5a34bbe" containerName="oc"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.458372 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.486258 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7nlzx"]
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.617488 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-catalog-content\") pod \"community-operators-7nlzx\" (UID: \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\") " pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.617621 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sjcv\" (UniqueName: \"kubernetes.io/projected/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-kube-api-access-6sjcv\") pod \"community-operators-7nlzx\" (UID: \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\") " pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.617704 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-utilities\") pod \"community-operators-7nlzx\" (UID: \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\") " pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.719153 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-utilities\") pod \"community-operators-7nlzx\" (UID: \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\") " pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.719224 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-catalog-content\") pod \"community-operators-7nlzx\" (UID: \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\") " pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.719263 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sjcv\" (UniqueName: \"kubernetes.io/projected/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-kube-api-access-6sjcv\") pod \"community-operators-7nlzx\" (UID: \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\") " pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.720089 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-utilities\") pod \"community-operators-7nlzx\" (UID: \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\") " pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.720184 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-catalog-content\") pod \"community-operators-7nlzx\" (UID: \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\") " pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.747462 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sjcv\" (UniqueName: \"kubernetes.io/projected/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-kube-api-access-6sjcv\") pod \"community-operators-7nlzx\" (UID: \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\") " pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:32 crc kubenswrapper[5008]: I0318 18:47:32.785960 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:33 crc kubenswrapper[5008]: I0318 18:47:33.248768 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7nlzx"]
Mar 18 18:47:34 crc kubenswrapper[5008]: I0318 18:47:34.020345 5008 generic.go:334] "Generic (PLEG): container finished" podID="1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" containerID="fcbc457ae6927ca40e60ac47229d8f7bfbfcebd0a5655098d7b7ac939749cbea" exitCode=0
Mar 18 18:47:34 crc kubenswrapper[5008]: I0318 18:47:34.020432 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nlzx" event={"ID":"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250","Type":"ContainerDied","Data":"fcbc457ae6927ca40e60ac47229d8f7bfbfcebd0a5655098d7b7ac939749cbea"}
Mar 18 18:47:34 crc kubenswrapper[5008]: I0318 18:47:34.021010 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nlzx" event={"ID":"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250","Type":"ContainerStarted","Data":"4e7ace157bd53531179f6719ea32195b22365daf6c850c17a9c1deeb8871c26b"}
Mar 18 18:47:35 crc kubenswrapper[5008]: I0318 18:47:35.029989 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nlzx" event={"ID":"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250","Type":"ContainerStarted","Data":"df4ada66f476572f2253007c2512d7260d9676a56127f2a2869aead0b77d48c8"}
Mar 18 18:47:36 crc kubenswrapper[5008]: I0318 18:47:36.037207 5008 generic.go:334] "Generic (PLEG): container finished" podID="1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" containerID="df4ada66f476572f2253007c2512d7260d9676a56127f2a2869aead0b77d48c8" exitCode=0
Mar 18 18:47:36 crc kubenswrapper[5008]: I0318 18:47:36.037324 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nlzx" event={"ID":"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250","Type":"ContainerDied","Data":"df4ada66f476572f2253007c2512d7260d9676a56127f2a2869aead0b77d48c8"}
Mar 18 18:47:37 crc kubenswrapper[5008]: I0318 18:47:37.045310 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nlzx" event={"ID":"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250","Type":"ContainerStarted","Data":"ec125d8149a19ed888f8a533e09ab27e53d0d6899464acb1763a96cdc9251b89"}
Mar 18 18:47:37 crc kubenswrapper[5008]: I0318 18:47:37.061922 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7nlzx" podStartSLOduration=2.516809352 podStartE2EDuration="5.061903427s" podCreationTimestamp="2026-03-18 18:47:32 +0000 UTC" firstStartedPulling="2026-03-18 18:47:34.02292875 +0000 UTC m=+2710.542401839" lastFinishedPulling="2026-03-18 18:47:36.568022785 +0000 UTC m=+2713.087495914" observedRunningTime="2026-03-18 18:47:37.060964762 +0000 UTC m=+2713.580437861" watchObservedRunningTime="2026-03-18 18:47:37.061903427 +0000 UTC m=+2713.581376506"
Mar 18 18:47:38 crc kubenswrapper[5008]: I0318 18:47:38.825971 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tndhn"]
Mar 18 18:47:38 crc kubenswrapper[5008]: I0318 18:47:38.827979 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:38 crc kubenswrapper[5008]: I0318 18:47:38.844461 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tndhn"]
Mar 18 18:47:39 crc kubenswrapper[5008]: I0318 18:47:39.017523 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1973835-2da1-4a58-bae1-7cbb8a499525-catalog-content\") pod \"redhat-operators-tndhn\" (UID: \"c1973835-2da1-4a58-bae1-7cbb8a499525\") " pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:39 crc kubenswrapper[5008]: I0318 18:47:39.017635 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs4qv\" (UniqueName: \"kubernetes.io/projected/c1973835-2da1-4a58-bae1-7cbb8a499525-kube-api-access-gs4qv\") pod \"redhat-operators-tndhn\" (UID: \"c1973835-2da1-4a58-bae1-7cbb8a499525\") " pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:39 crc kubenswrapper[5008]: I0318 18:47:39.017701 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1973835-2da1-4a58-bae1-7cbb8a499525-utilities\") pod \"redhat-operators-tndhn\" (UID: \"c1973835-2da1-4a58-bae1-7cbb8a499525\") " pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:39 crc kubenswrapper[5008]: I0318 18:47:39.119411 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1973835-2da1-4a58-bae1-7cbb8a499525-catalog-content\") pod \"redhat-operators-tndhn\" (UID: \"c1973835-2da1-4a58-bae1-7cbb8a499525\") " pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:39 crc kubenswrapper[5008]: I0318 18:47:39.119466 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs4qv\" (UniqueName: \"kubernetes.io/projected/c1973835-2da1-4a58-bae1-7cbb8a499525-kube-api-access-gs4qv\") pod \"redhat-operators-tndhn\" (UID: \"c1973835-2da1-4a58-bae1-7cbb8a499525\") " pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:39 crc kubenswrapper[5008]: I0318 18:47:39.119504 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1973835-2da1-4a58-bae1-7cbb8a499525-utilities\") pod \"redhat-operators-tndhn\" (UID: \"c1973835-2da1-4a58-bae1-7cbb8a499525\") " pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:39 crc kubenswrapper[5008]: I0318 18:47:39.120027 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1973835-2da1-4a58-bae1-7cbb8a499525-catalog-content\") pod \"redhat-operators-tndhn\" (UID: \"c1973835-2da1-4a58-bae1-7cbb8a499525\") " pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:39 crc kubenswrapper[5008]: I0318 18:47:39.120074 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1973835-2da1-4a58-bae1-7cbb8a499525-utilities\") pod \"redhat-operators-tndhn\" (UID: \"c1973835-2da1-4a58-bae1-7cbb8a499525\") " pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:39 crc kubenswrapper[5008]: I0318 18:47:39.138426 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs4qv\" (UniqueName: \"kubernetes.io/projected/c1973835-2da1-4a58-bae1-7cbb8a499525-kube-api-access-gs4qv\") pod \"redhat-operators-tndhn\" (UID: \"c1973835-2da1-4a58-bae1-7cbb8a499525\") " pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:39 crc kubenswrapper[5008]: I0318 18:47:39.153147 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:39 crc kubenswrapper[5008]: I0318 18:47:39.373909 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tndhn"]
Mar 18 18:47:40 crc kubenswrapper[5008]: I0318 18:47:40.066876 5008 generic.go:334] "Generic (PLEG): container finished" podID="c1973835-2da1-4a58-bae1-7cbb8a499525" containerID="b301c2341fc0033d67dc7526ce64e9e9d1be9eeb02df231d168ed0ad0a1badb0" exitCode=0
Mar 18 18:47:40 crc kubenswrapper[5008]: I0318 18:47:40.066981 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tndhn" event={"ID":"c1973835-2da1-4a58-bae1-7cbb8a499525","Type":"ContainerDied","Data":"b301c2341fc0033d67dc7526ce64e9e9d1be9eeb02df231d168ed0ad0a1badb0"}
Mar 18 18:47:40 crc kubenswrapper[5008]: I0318 18:47:40.067243 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tndhn" event={"ID":"c1973835-2da1-4a58-bae1-7cbb8a499525","Type":"ContainerStarted","Data":"4cf76d698e71717a0eb89adc542e03a5e7cb78fc4946ddc3931801964de68645"}
Mar 18 18:47:41 crc kubenswrapper[5008]: I0318 18:47:41.113470 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tndhn" event={"ID":"c1973835-2da1-4a58-bae1-7cbb8a499525","Type":"ContainerStarted","Data":"37cb7863df96b9d3e970f5e374d87db99d2c3335ae946a6b45e1ae33969ea291"}
Mar 18 18:47:42 crc kubenswrapper[5008]: I0318 18:47:42.124652 5008 generic.go:334] "Generic (PLEG): container finished" podID="c1973835-2da1-4a58-bae1-7cbb8a499525" containerID="37cb7863df96b9d3e970f5e374d87db99d2c3335ae946a6b45e1ae33969ea291" exitCode=0
Mar 18 18:47:42 crc kubenswrapper[5008]: I0318 18:47:42.124739 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tndhn" event={"ID":"c1973835-2da1-4a58-bae1-7cbb8a499525","Type":"ContainerDied","Data":"37cb7863df96b9d3e970f5e374d87db99d2c3335ae946a6b45e1ae33969ea291"}
Mar 18 18:47:42 crc kubenswrapper[5008]: I0318 18:47:42.787860 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:42 crc kubenswrapper[5008]: I0318 18:47:42.788274 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:42 crc kubenswrapper[5008]: I0318 18:47:42.851025 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:43 crc kubenswrapper[5008]: I0318 18:47:43.134361 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tndhn" event={"ID":"c1973835-2da1-4a58-bae1-7cbb8a499525","Type":"ContainerStarted","Data":"35f5a7d21bcb5dce02ac25d04321c145358c69436e60c2e41f1a31f1c4d60041"}
Mar 18 18:47:43 crc kubenswrapper[5008]: I0318 18:47:43.205753 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:43 crc kubenswrapper[5008]: I0318 18:47:43.239989 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tndhn" podStartSLOduration=2.729400122 podStartE2EDuration="5.239960621s" podCreationTimestamp="2026-03-18 18:47:38 +0000 UTC" firstStartedPulling="2026-03-18 18:47:40.06868201 +0000 UTC m=+2716.588155089" lastFinishedPulling="2026-03-18 18:47:42.579242499 +0000 UTC m=+2719.098715588" observedRunningTime="2026-03-18 18:47:43.155161784 +0000 UTC m=+2719.674634923" watchObservedRunningTime="2026-03-18 18:47:43.239960621 +0000 UTC m=+2719.759433740"
Mar 18 18:47:45 crc kubenswrapper[5008]: I0318 18:47:45.211427 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7nlzx"]
Mar 18 18:47:45 crc kubenswrapper[5008]: I0318 18:47:45.211956 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7nlzx" podUID="1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" containerName="registry-server" containerID="cri-o://ec125d8149a19ed888f8a533e09ab27e53d0d6899464acb1763a96cdc9251b89" gracePeriod=2
Mar 18 18:47:45 crc kubenswrapper[5008]: I0318 18:47:45.725248 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:45 crc kubenswrapper[5008]: I0318 18:47:45.821911 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-catalog-content\") pod \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\" (UID: \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\") "
Mar 18 18:47:45 crc kubenswrapper[5008]: I0318 18:47:45.822025 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sjcv\" (UniqueName: \"kubernetes.io/projected/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-kube-api-access-6sjcv\") pod \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\" (UID: \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\") "
Mar 18 18:47:45 crc kubenswrapper[5008]: I0318 18:47:45.822086 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-utilities\") pod \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\" (UID: \"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250\") "
Mar 18 18:47:45 crc kubenswrapper[5008]: I0318 18:47:45.822958 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-utilities" (OuterVolumeSpecName: "utilities") pod "1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" (UID: "1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:47:45 crc kubenswrapper[5008]: I0318 18:47:45.831786 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-kube-api-access-6sjcv" (OuterVolumeSpecName: "kube-api-access-6sjcv") pod "1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" (UID: "1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250"). InnerVolumeSpecName "kube-api-access-6sjcv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:47:45 crc kubenswrapper[5008]: I0318 18:47:45.872916 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" (UID: "1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:47:45 crc kubenswrapper[5008]: I0318 18:47:45.923987 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:47:45 crc kubenswrapper[5008]: I0318 18:47:45.924047 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sjcv\" (UniqueName: \"kubernetes.io/projected/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-kube-api-access-6sjcv\") on node \"crc\" DevicePath \"\""
Mar 18 18:47:45 crc kubenswrapper[5008]: I0318 18:47:45.924072 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.158802 5008 generic.go:334] "Generic (PLEG): container finished" podID="1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" containerID="ec125d8149a19ed888f8a533e09ab27e53d0d6899464acb1763a96cdc9251b89" exitCode=0
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.158845 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nlzx" event={"ID":"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250","Type":"ContainerDied","Data":"ec125d8149a19ed888f8a533e09ab27e53d0d6899464acb1763a96cdc9251b89"}
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.158878 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nlzx" event={"ID":"1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250","Type":"ContainerDied","Data":"4e7ace157bd53531179f6719ea32195b22365daf6c850c17a9c1deeb8871c26b"}
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.158895 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7nlzx"
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.158901 5008 scope.go:117] "RemoveContainer" containerID="ec125d8149a19ed888f8a533e09ab27e53d0d6899464acb1763a96cdc9251b89"
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.183797 5008 scope.go:117] "RemoveContainer" containerID="df4ada66f476572f2253007c2512d7260d9676a56127f2a2869aead0b77d48c8"
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.213159 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7nlzx"]
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.228367 5008 scope.go:117] "RemoveContainer" containerID="fcbc457ae6927ca40e60ac47229d8f7bfbfcebd0a5655098d7b7ac939749cbea"
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.232681 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7nlzx"]
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.252580 5008 scope.go:117] "RemoveContainer" containerID="ec125d8149a19ed888f8a533e09ab27e53d0d6899464acb1763a96cdc9251b89"
Mar 18 18:47:46 crc kubenswrapper[5008]: E0318 18:47:46.253106 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec125d8149a19ed888f8a533e09ab27e53d0d6899464acb1763a96cdc9251b89\": container with ID starting with ec125d8149a19ed888f8a533e09ab27e53d0d6899464acb1763a96cdc9251b89 not found: ID does not exist" containerID="ec125d8149a19ed888f8a533e09ab27e53d0d6899464acb1763a96cdc9251b89"
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.253156 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec125d8149a19ed888f8a533e09ab27e53d0d6899464acb1763a96cdc9251b89"} err="failed to get container status \"ec125d8149a19ed888f8a533e09ab27e53d0d6899464acb1763a96cdc9251b89\": rpc error: code = NotFound desc = could not find container \"ec125d8149a19ed888f8a533e09ab27e53d0d6899464acb1763a96cdc9251b89\": container with ID starting with ec125d8149a19ed888f8a533e09ab27e53d0d6899464acb1763a96cdc9251b89 not found: ID does not exist"
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.253191 5008 scope.go:117] "RemoveContainer" containerID="df4ada66f476572f2253007c2512d7260d9676a56127f2a2869aead0b77d48c8"
Mar 18 18:47:46 crc kubenswrapper[5008]: E0318 18:47:46.253918 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df4ada66f476572f2253007c2512d7260d9676a56127f2a2869aead0b77d48c8\": container with ID starting with df4ada66f476572f2253007c2512d7260d9676a56127f2a2869aead0b77d48c8 not found: ID does not exist" containerID="df4ada66f476572f2253007c2512d7260d9676a56127f2a2869aead0b77d48c8"
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.254072 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df4ada66f476572f2253007c2512d7260d9676a56127f2a2869aead0b77d48c8"} err="failed to get container status \"df4ada66f476572f2253007c2512d7260d9676a56127f2a2869aead0b77d48c8\": rpc error: code = NotFound desc = could not find container \"df4ada66f476572f2253007c2512d7260d9676a56127f2a2869aead0b77d48c8\": container with ID starting with df4ada66f476572f2253007c2512d7260d9676a56127f2a2869aead0b77d48c8 not found: ID does not exist"
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.254190 5008 scope.go:117] "RemoveContainer" containerID="fcbc457ae6927ca40e60ac47229d8f7bfbfcebd0a5655098d7b7ac939749cbea"
Mar 18 18:47:46 crc kubenswrapper[5008]: E0318 18:47:46.254621 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcbc457ae6927ca40e60ac47229d8f7bfbfcebd0a5655098d7b7ac939749cbea\": container with ID starting with fcbc457ae6927ca40e60ac47229d8f7bfbfcebd0a5655098d7b7ac939749cbea not found: ID does not exist" containerID="fcbc457ae6927ca40e60ac47229d8f7bfbfcebd0a5655098d7b7ac939749cbea"
Mar 18 18:47:46 crc kubenswrapper[5008]: I0318 18:47:46.254773 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcbc457ae6927ca40e60ac47229d8f7bfbfcebd0a5655098d7b7ac939749cbea"} err="failed to get container status \"fcbc457ae6927ca40e60ac47229d8f7bfbfcebd0a5655098d7b7ac939749cbea\": rpc error: code = NotFound desc = could not find container \"fcbc457ae6927ca40e60ac47229d8f7bfbfcebd0a5655098d7b7ac939749cbea\": container with ID starting with fcbc457ae6927ca40e60ac47229d8f7bfbfcebd0a5655098d7b7ac939749cbea not found: ID does not exist"
Mar 18 18:47:48 crc kubenswrapper[5008]: I0318 18:47:48.208665 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" path="/var/lib/kubelet/pods/1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250/volumes"
Mar 18 18:47:49 crc kubenswrapper[5008]: I0318 18:47:49.154418 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:49 crc kubenswrapper[5008]: I0318 18:47:49.154796 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:50 crc kubenswrapper[5008]: I0318 18:47:50.198537 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tndhn" podUID="c1973835-2da1-4a58-bae1-7cbb8a499525" containerName="registry-server" probeResult="failure" output=<
Mar 18 18:47:50 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s
Mar 18 18:47:50 crc kubenswrapper[5008]: >
Mar 18 18:47:54 crc kubenswrapper[5008]: I0318 18:47:54.460413 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:47:54 crc kubenswrapper[5008]: I0318 18:47:54.461173 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:47:59 crc kubenswrapper[5008]: I0318 18:47:59.230128 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:59 crc kubenswrapper[5008]: I0318 18:47:59.297023 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:47:59 crc kubenswrapper[5008]: I0318 18:47:59.488080 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tndhn"]
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.159810 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564328-djpsv"]
Mar 18 18:48:00 crc kubenswrapper[5008]: E0318 18:48:00.160415 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" containerName="registry-server"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.160450 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" containerName="registry-server"
Mar 18 18:48:00 crc kubenswrapper[5008]: E0318 18:48:00.160488 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" containerName="extract-content"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.160506 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" containerName="extract-content"
Mar 18 18:48:00 crc kubenswrapper[5008]: E0318 18:48:00.160591 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" containerName="extract-utilities"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.160613 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" containerName="extract-utilities"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.160966 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a444ed7-ffcb-43f2-9fe5-b59a7c4c2250" containerName="registry-server"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.161992 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564328-djpsv"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.164159 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.165450 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.166637 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.173866 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564328-djpsv"]
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.276343 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tndhn" podUID="c1973835-2da1-4a58-bae1-7cbb8a499525" containerName="registry-server" containerID="cri-o://35f5a7d21bcb5dce02ac25d04321c145358c69436e60c2e41f1a31f1c4d60041" gracePeriod=2
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.280961 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2qmd\" (UniqueName: \"kubernetes.io/projected/ccb505c7-9e8c-4a09-9187-fb14bc31a590-kube-api-access-g2qmd\") pod \"auto-csr-approver-29564328-djpsv\" (UID: \"ccb505c7-9e8c-4a09-9187-fb14bc31a590\") " pod="openshift-infra/auto-csr-approver-29564328-djpsv"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.383948 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2qmd\" (UniqueName: \"kubernetes.io/projected/ccb505c7-9e8c-4a09-9187-fb14bc31a590-kube-api-access-g2qmd\") pod \"auto-csr-approver-29564328-djpsv\" (UID: \"ccb505c7-9e8c-4a09-9187-fb14bc31a590\") " pod="openshift-infra/auto-csr-approver-29564328-djpsv"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.404259 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2qmd\" (UniqueName: \"kubernetes.io/projected/ccb505c7-9e8c-4a09-9187-fb14bc31a590-kube-api-access-g2qmd\") pod \"auto-csr-approver-29564328-djpsv\" (UID: \"ccb505c7-9e8c-4a09-9187-fb14bc31a590\") " pod="openshift-infra/auto-csr-approver-29564328-djpsv"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.513286 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564328-djpsv"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.688678 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.819541 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs4qv\" (UniqueName: \"kubernetes.io/projected/c1973835-2da1-4a58-bae1-7cbb8a499525-kube-api-access-gs4qv\") pod \"c1973835-2da1-4a58-bae1-7cbb8a499525\" (UID: \"c1973835-2da1-4a58-bae1-7cbb8a499525\") "
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.820035 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1973835-2da1-4a58-bae1-7cbb8a499525-utilities\") pod \"c1973835-2da1-4a58-bae1-7cbb8a499525\" (UID: \"c1973835-2da1-4a58-bae1-7cbb8a499525\") "
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.820076 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1973835-2da1-4a58-bae1-7cbb8a499525-catalog-content\") pod \"c1973835-2da1-4a58-bae1-7cbb8a499525\" (UID: \"c1973835-2da1-4a58-bae1-7cbb8a499525\") "
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.821107 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1973835-2da1-4a58-bae1-7cbb8a499525-utilities" (OuterVolumeSpecName: "utilities") pod "c1973835-2da1-4a58-bae1-7cbb8a499525" (UID: "c1973835-2da1-4a58-bae1-7cbb8a499525"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.823813 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1973835-2da1-4a58-bae1-7cbb8a499525-kube-api-access-gs4qv" (OuterVolumeSpecName: "kube-api-access-gs4qv") pod "c1973835-2da1-4a58-bae1-7cbb8a499525" (UID: "c1973835-2da1-4a58-bae1-7cbb8a499525"). InnerVolumeSpecName "kube-api-access-gs4qv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.923632 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs4qv\" (UniqueName: \"kubernetes.io/projected/c1973835-2da1-4a58-bae1-7cbb8a499525-kube-api-access-gs4qv\") on node \"crc\" DevicePath \"\""
Mar 18 18:48:00 crc kubenswrapper[5008]: I0318 18:48:00.923676 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1973835-2da1-4a58-bae1-7cbb8a499525-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.018849 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564328-djpsv"]
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.049272 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1973835-2da1-4a58-bae1-7cbb8a499525-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1973835-2da1-4a58-bae1-7cbb8a499525" (UID: "c1973835-2da1-4a58-bae1-7cbb8a499525"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.126029 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1973835-2da1-4a58-bae1-7cbb8a499525-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.287260 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564328-djpsv" event={"ID":"ccb505c7-9e8c-4a09-9187-fb14bc31a590","Type":"ContainerStarted","Data":"25ce61c1405fe19f85420ddf5012d05d5070cd9dc61fd4ea6d84ebec43fc46b5"}
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.292270 5008 generic.go:334] "Generic (PLEG): container finished" podID="c1973835-2da1-4a58-bae1-7cbb8a499525" containerID="35f5a7d21bcb5dce02ac25d04321c145358c69436e60c2e41f1a31f1c4d60041" exitCode=0
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.292333 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tndhn" event={"ID":"c1973835-2da1-4a58-bae1-7cbb8a499525","Type":"ContainerDied","Data":"35f5a7d21bcb5dce02ac25d04321c145358c69436e60c2e41f1a31f1c4d60041"}
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.292376 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tndhn" event={"ID":"c1973835-2da1-4a58-bae1-7cbb8a499525","Type":"ContainerDied","Data":"4cf76d698e71717a0eb89adc542e03a5e7cb78fc4946ddc3931801964de68645"}
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.292411 5008 scope.go:117] "RemoveContainer" containerID="35f5a7d21bcb5dce02ac25d04321c145358c69436e60c2e41f1a31f1c4d60041"
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.292638 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tndhn"
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.321426 5008 scope.go:117] "RemoveContainer" containerID="37cb7863df96b9d3e970f5e374d87db99d2c3335ae946a6b45e1ae33969ea291"
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.367036 5008 scope.go:117] "RemoveContainer" containerID="b301c2341fc0033d67dc7526ce64e9e9d1be9eeb02df231d168ed0ad0a1badb0"
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.368857 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tndhn"]
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.376205 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tndhn"]
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.387671 5008 scope.go:117] "RemoveContainer" containerID="35f5a7d21bcb5dce02ac25d04321c145358c69436e60c2e41f1a31f1c4d60041"
Mar 18 18:48:01 crc kubenswrapper[5008]: E0318 18:48:01.388617 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35f5a7d21bcb5dce02ac25d04321c145358c69436e60c2e41f1a31f1c4d60041\": container with ID starting with 35f5a7d21bcb5dce02ac25d04321c145358c69436e60c2e41f1a31f1c4d60041 not found: ID does not exist" containerID="35f5a7d21bcb5dce02ac25d04321c145358c69436e60c2e41f1a31f1c4d60041"
Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.388656 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f5a7d21bcb5dce02ac25d04321c145358c69436e60c2e41f1a31f1c4d60041"} err="failed to get container status \"35f5a7d21bcb5dce02ac25d04321c145358c69436e60c2e41f1a31f1c4d60041\": rpc error: code = NotFound desc = could not find container \"35f5a7d21bcb5dce02ac25d04321c145358c69436e60c2e41f1a31f1c4d60041\": container with ID starting with 35f5a7d21bcb5dce02ac25d04321c145358c69436e60c2e41f1a31f1c4d60041 not found: ID does
not exist" Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.388681 5008 scope.go:117] "RemoveContainer" containerID="37cb7863df96b9d3e970f5e374d87db99d2c3335ae946a6b45e1ae33969ea291" Mar 18 18:48:01 crc kubenswrapper[5008]: E0318 18:48:01.389116 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37cb7863df96b9d3e970f5e374d87db99d2c3335ae946a6b45e1ae33969ea291\": container with ID starting with 37cb7863df96b9d3e970f5e374d87db99d2c3335ae946a6b45e1ae33969ea291 not found: ID does not exist" containerID="37cb7863df96b9d3e970f5e374d87db99d2c3335ae946a6b45e1ae33969ea291" Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.389174 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37cb7863df96b9d3e970f5e374d87db99d2c3335ae946a6b45e1ae33969ea291"} err="failed to get container status \"37cb7863df96b9d3e970f5e374d87db99d2c3335ae946a6b45e1ae33969ea291\": rpc error: code = NotFound desc = could not find container \"37cb7863df96b9d3e970f5e374d87db99d2c3335ae946a6b45e1ae33969ea291\": container with ID starting with 37cb7863df96b9d3e970f5e374d87db99d2c3335ae946a6b45e1ae33969ea291 not found: ID does not exist" Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.389212 5008 scope.go:117] "RemoveContainer" containerID="b301c2341fc0033d67dc7526ce64e9e9d1be9eeb02df231d168ed0ad0a1badb0" Mar 18 18:48:01 crc kubenswrapper[5008]: E0318 18:48:01.389687 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b301c2341fc0033d67dc7526ce64e9e9d1be9eeb02df231d168ed0ad0a1badb0\": container with ID starting with b301c2341fc0033d67dc7526ce64e9e9d1be9eeb02df231d168ed0ad0a1badb0 not found: ID does not exist" containerID="b301c2341fc0033d67dc7526ce64e9e9d1be9eeb02df231d168ed0ad0a1badb0" Mar 18 18:48:01 crc kubenswrapper[5008]: I0318 18:48:01.389726 5008 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b301c2341fc0033d67dc7526ce64e9e9d1be9eeb02df231d168ed0ad0a1badb0"} err="failed to get container status \"b301c2341fc0033d67dc7526ce64e9e9d1be9eeb02df231d168ed0ad0a1badb0\": rpc error: code = NotFound desc = could not find container \"b301c2341fc0033d67dc7526ce64e9e9d1be9eeb02df231d168ed0ad0a1badb0\": container with ID starting with b301c2341fc0033d67dc7526ce64e9e9d1be9eeb02df231d168ed0ad0a1badb0 not found: ID does not exist" Mar 18 18:48:02 crc kubenswrapper[5008]: I0318 18:48:02.214169 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1973835-2da1-4a58-bae1-7cbb8a499525" path="/var/lib/kubelet/pods/c1973835-2da1-4a58-bae1-7cbb8a499525/volumes" Mar 18 18:48:03 crc kubenswrapper[5008]: I0318 18:48:03.313095 5008 generic.go:334] "Generic (PLEG): container finished" podID="ccb505c7-9e8c-4a09-9187-fb14bc31a590" containerID="b1f32e6cb0c79b069393df61652e98a85c9cc3a4363db8b0aac3ab81e8794450" exitCode=0 Mar 18 18:48:03 crc kubenswrapper[5008]: I0318 18:48:03.313158 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564328-djpsv" event={"ID":"ccb505c7-9e8c-4a09-9187-fb14bc31a590","Type":"ContainerDied","Data":"b1f32e6cb0c79b069393df61652e98a85c9cc3a4363db8b0aac3ab81e8794450"} Mar 18 18:48:04 crc kubenswrapper[5008]: I0318 18:48:04.713732 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564328-djpsv" Mar 18 18:48:04 crc kubenswrapper[5008]: I0318 18:48:04.885668 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2qmd\" (UniqueName: \"kubernetes.io/projected/ccb505c7-9e8c-4a09-9187-fb14bc31a590-kube-api-access-g2qmd\") pod \"ccb505c7-9e8c-4a09-9187-fb14bc31a590\" (UID: \"ccb505c7-9e8c-4a09-9187-fb14bc31a590\") " Mar 18 18:48:04 crc kubenswrapper[5008]: I0318 18:48:04.893785 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb505c7-9e8c-4a09-9187-fb14bc31a590-kube-api-access-g2qmd" (OuterVolumeSpecName: "kube-api-access-g2qmd") pod "ccb505c7-9e8c-4a09-9187-fb14bc31a590" (UID: "ccb505c7-9e8c-4a09-9187-fb14bc31a590"). InnerVolumeSpecName "kube-api-access-g2qmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:48:04 crc kubenswrapper[5008]: I0318 18:48:04.988006 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2qmd\" (UniqueName: \"kubernetes.io/projected/ccb505c7-9e8c-4a09-9187-fb14bc31a590-kube-api-access-g2qmd\") on node \"crc\" DevicePath \"\"" Mar 18 18:48:05 crc kubenswrapper[5008]: I0318 18:48:05.343172 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564328-djpsv" event={"ID":"ccb505c7-9e8c-4a09-9187-fb14bc31a590","Type":"ContainerDied","Data":"25ce61c1405fe19f85420ddf5012d05d5070cd9dc61fd4ea6d84ebec43fc46b5"} Mar 18 18:48:05 crc kubenswrapper[5008]: I0318 18:48:05.343211 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564328-djpsv" Mar 18 18:48:05 crc kubenswrapper[5008]: I0318 18:48:05.343215 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25ce61c1405fe19f85420ddf5012d05d5070cd9dc61fd4ea6d84ebec43fc46b5" Mar 18 18:48:05 crc kubenswrapper[5008]: I0318 18:48:05.795818 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564322-dlfpn"] Mar 18 18:48:05 crc kubenswrapper[5008]: I0318 18:48:05.802483 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564322-dlfpn"] Mar 18 18:48:06 crc kubenswrapper[5008]: I0318 18:48:06.208801 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e29c6b1-9049-47ea-97a6-c1f00f42ebad" path="/var/lib/kubelet/pods/6e29c6b1-9049-47ea-97a6-c1f00f42ebad/volumes" Mar 18 18:48:24 crc kubenswrapper[5008]: I0318 18:48:24.460211 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:48:24 crc kubenswrapper[5008]: I0318 18:48:24.460791 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:48:24 crc kubenswrapper[5008]: I0318 18:48:24.460902 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:48:24 crc kubenswrapper[5008]: I0318 18:48:24.461808 5008 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5c837c0a7c89f4a27127e124d9788fb50066a8b2a7b9525f7a5cb470479e844"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:48:24 crc kubenswrapper[5008]: I0318 18:48:24.461901 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://c5c837c0a7c89f4a27127e124d9788fb50066a8b2a7b9525f7a5cb470479e844" gracePeriod=600 Mar 18 18:48:25 crc kubenswrapper[5008]: I0318 18:48:25.540905 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="c5c837c0a7c89f4a27127e124d9788fb50066a8b2a7b9525f7a5cb470479e844" exitCode=0 Mar 18 18:48:25 crc kubenswrapper[5008]: I0318 18:48:25.541030 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"c5c837c0a7c89f4a27127e124d9788fb50066a8b2a7b9525f7a5cb470479e844"} Mar 18 18:48:25 crc kubenswrapper[5008]: I0318 18:48:25.541552 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab"} Mar 18 18:48:25 crc kubenswrapper[5008]: I0318 18:48:25.541638 5008 scope.go:117] "RemoveContainer" containerID="69bcd8b205a2d9073c6f35a5f0c575c80274a608551ab599bd385778340ef06c" Mar 18 18:48:41 crc kubenswrapper[5008]: I0318 18:48:41.306869 5008 scope.go:117] "RemoveContainer" containerID="f34960220ec0a7d60fe64f84aaa86216309edcf97ff42bebb28c2b5d506267dc" Mar 18 
18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.170334 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564330-s9lqc"] Mar 18 18:50:00 crc kubenswrapper[5008]: E0318 18:50:00.171468 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb505c7-9e8c-4a09-9187-fb14bc31a590" containerName="oc" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.171491 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb505c7-9e8c-4a09-9187-fb14bc31a590" containerName="oc" Mar 18 18:50:00 crc kubenswrapper[5008]: E0318 18:50:00.171517 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1973835-2da1-4a58-bae1-7cbb8a499525" containerName="registry-server" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.171525 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1973835-2da1-4a58-bae1-7cbb8a499525" containerName="registry-server" Mar 18 18:50:00 crc kubenswrapper[5008]: E0318 18:50:00.171543 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1973835-2da1-4a58-bae1-7cbb8a499525" containerName="extract-utilities" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.171579 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1973835-2da1-4a58-bae1-7cbb8a499525" containerName="extract-utilities" Mar 18 18:50:00 crc kubenswrapper[5008]: E0318 18:50:00.171602 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1973835-2da1-4a58-bae1-7cbb8a499525" containerName="extract-content" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.171613 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1973835-2da1-4a58-bae1-7cbb8a499525" containerName="extract-content" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.171793 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb505c7-9e8c-4a09-9187-fb14bc31a590" containerName="oc" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.171830 5008 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c1973835-2da1-4a58-bae1-7cbb8a499525" containerName="registry-server" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.172459 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564330-s9lqc" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.175610 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.175615 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.175666 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.182939 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564330-s9lqc"] Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.339143 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqvjl\" (UniqueName: \"kubernetes.io/projected/ef74e457-30de-42f8-99f1-fcf9bc99e004-kube-api-access-qqvjl\") pod \"auto-csr-approver-29564330-s9lqc\" (UID: \"ef74e457-30de-42f8-99f1-fcf9bc99e004\") " pod="openshift-infra/auto-csr-approver-29564330-s9lqc" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.441109 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqvjl\" (UniqueName: \"kubernetes.io/projected/ef74e457-30de-42f8-99f1-fcf9bc99e004-kube-api-access-qqvjl\") pod \"auto-csr-approver-29564330-s9lqc\" (UID: \"ef74e457-30de-42f8-99f1-fcf9bc99e004\") " pod="openshift-infra/auto-csr-approver-29564330-s9lqc" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.473032 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qqvjl\" (UniqueName: \"kubernetes.io/projected/ef74e457-30de-42f8-99f1-fcf9bc99e004-kube-api-access-qqvjl\") pod \"auto-csr-approver-29564330-s9lqc\" (UID: \"ef74e457-30de-42f8-99f1-fcf9bc99e004\") " pod="openshift-infra/auto-csr-approver-29564330-s9lqc" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.489040 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564330-s9lqc" Mar 18 18:50:00 crc kubenswrapper[5008]: I0318 18:50:00.965084 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564330-s9lqc"] Mar 18 18:50:01 crc kubenswrapper[5008]: I0318 18:50:01.411905 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564330-s9lqc" event={"ID":"ef74e457-30de-42f8-99f1-fcf9bc99e004","Type":"ContainerStarted","Data":"0e360f1f89292adcfa272b8472235f31c4f2750f56c8911c17afee42c27f5ae5"} Mar 18 18:50:03 crc kubenswrapper[5008]: I0318 18:50:03.433611 5008 generic.go:334] "Generic (PLEG): container finished" podID="ef74e457-30de-42f8-99f1-fcf9bc99e004" containerID="3f55aa2d0af21bf3bad2e036344e8f8d7af9e9adce0fe493e391c1b267644224" exitCode=0 Mar 18 18:50:03 crc kubenswrapper[5008]: I0318 18:50:03.433712 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564330-s9lqc" event={"ID":"ef74e457-30de-42f8-99f1-fcf9bc99e004","Type":"ContainerDied","Data":"3f55aa2d0af21bf3bad2e036344e8f8d7af9e9adce0fe493e391c1b267644224"} Mar 18 18:50:04 crc kubenswrapper[5008]: I0318 18:50:04.878359 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564330-s9lqc" Mar 18 18:50:05 crc kubenswrapper[5008]: I0318 18:50:05.013116 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqvjl\" (UniqueName: \"kubernetes.io/projected/ef74e457-30de-42f8-99f1-fcf9bc99e004-kube-api-access-qqvjl\") pod \"ef74e457-30de-42f8-99f1-fcf9bc99e004\" (UID: \"ef74e457-30de-42f8-99f1-fcf9bc99e004\") " Mar 18 18:50:05 crc kubenswrapper[5008]: I0318 18:50:05.021094 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef74e457-30de-42f8-99f1-fcf9bc99e004-kube-api-access-qqvjl" (OuterVolumeSpecName: "kube-api-access-qqvjl") pod "ef74e457-30de-42f8-99f1-fcf9bc99e004" (UID: "ef74e457-30de-42f8-99f1-fcf9bc99e004"). InnerVolumeSpecName "kube-api-access-qqvjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:50:05 crc kubenswrapper[5008]: I0318 18:50:05.114983 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqvjl\" (UniqueName: \"kubernetes.io/projected/ef74e457-30de-42f8-99f1-fcf9bc99e004-kube-api-access-qqvjl\") on node \"crc\" DevicePath \"\"" Mar 18 18:50:05 crc kubenswrapper[5008]: I0318 18:50:05.508102 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564330-s9lqc" event={"ID":"ef74e457-30de-42f8-99f1-fcf9bc99e004","Type":"ContainerDied","Data":"0e360f1f89292adcfa272b8472235f31c4f2750f56c8911c17afee42c27f5ae5"} Mar 18 18:50:05 crc kubenswrapper[5008]: I0318 18:50:05.508150 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e360f1f89292adcfa272b8472235f31c4f2750f56c8911c17afee42c27f5ae5" Mar 18 18:50:05 crc kubenswrapper[5008]: I0318 18:50:05.508181 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564330-s9lqc" Mar 18 18:50:05 crc kubenswrapper[5008]: I0318 18:50:05.963991 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564324-9hf9w"] Mar 18 18:50:05 crc kubenswrapper[5008]: I0318 18:50:05.971335 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564324-9hf9w"] Mar 18 18:50:06 crc kubenswrapper[5008]: I0318 18:50:06.237398 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b7503e3-be08-46e5-9651-5acfe48f9cb1" path="/var/lib/kubelet/pods/6b7503e3-be08-46e5-9651-5acfe48f9cb1/volumes" Mar 18 18:50:24 crc kubenswrapper[5008]: I0318 18:50:24.460517 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:50:24 crc kubenswrapper[5008]: I0318 18:50:24.461030 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:50:41 crc kubenswrapper[5008]: I0318 18:50:41.418013 5008 scope.go:117] "RemoveContainer" containerID="66f7e0158677c710b9ba8f2a71888418d2926ea46dff65484195eefb67dc7626" Mar 18 18:50:54 crc kubenswrapper[5008]: I0318 18:50:54.460175 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:50:54 crc kubenswrapper[5008]: 
I0318 18:50:54.460988 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:51:24 crc kubenswrapper[5008]: I0318 18:51:24.460022 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:51:24 crc kubenswrapper[5008]: I0318 18:51:24.460678 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:51:24 crc kubenswrapper[5008]: I0318 18:51:24.460736 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:51:24 crc kubenswrapper[5008]: I0318 18:51:24.461430 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:51:24 crc kubenswrapper[5008]: I0318 18:51:24.461532 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" 
containerName="machine-config-daemon" containerID="cri-o://cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" gracePeriod=600 Mar 18 18:51:24 crc kubenswrapper[5008]: E0318 18:51:24.594109 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:51:25 crc kubenswrapper[5008]: I0318 18:51:25.229108 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" exitCode=0 Mar 18 18:51:25 crc kubenswrapper[5008]: I0318 18:51:25.229190 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab"} Mar 18 18:51:25 crc kubenswrapper[5008]: I0318 18:51:25.229541 5008 scope.go:117] "RemoveContainer" containerID="c5c837c0a7c89f4a27127e124d9788fb50066a8b2a7b9525f7a5cb470479e844" Mar 18 18:51:25 crc kubenswrapper[5008]: I0318 18:51:25.230082 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:51:25 crc kubenswrapper[5008]: E0318 18:51:25.230447 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:51:40 crc kubenswrapper[5008]: I0318 18:51:40.198323 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:51:40 crc kubenswrapper[5008]: E0318 18:51:40.199089 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:51:52 crc kubenswrapper[5008]: I0318 18:51:52.198479 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:51:52 crc kubenswrapper[5008]: E0318 18:51:52.199939 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 18:52:00.162351 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564332-86fnq"] Mar 18 18:52:00 crc kubenswrapper[5008]: E0318 18:52:00.163527 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef74e457-30de-42f8-99f1-fcf9bc99e004" containerName="oc" Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 18:52:00.163551 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef74e457-30de-42f8-99f1-fcf9bc99e004" containerName="oc" 
Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 18:52:00.164043 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef74e457-30de-42f8-99f1-fcf9bc99e004" containerName="oc" Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 18:52:00.164979 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564332-86fnq" Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 18:52:00.167648 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 18:52:00.168359 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 18:52:00.168685 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 18:52:00.193109 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564332-86fnq"] Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 18:52:00.296516 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpfxw\" (UniqueName: \"kubernetes.io/projected/71aaed7d-973b-464e-83d7-cecddfe76eaf-kube-api-access-xpfxw\") pod \"auto-csr-approver-29564332-86fnq\" (UID: \"71aaed7d-973b-464e-83d7-cecddfe76eaf\") " pod="openshift-infra/auto-csr-approver-29564332-86fnq" Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 18:52:00.398040 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpfxw\" (UniqueName: \"kubernetes.io/projected/71aaed7d-973b-464e-83d7-cecddfe76eaf-kube-api-access-xpfxw\") pod \"auto-csr-approver-29564332-86fnq\" (UID: \"71aaed7d-973b-464e-83d7-cecddfe76eaf\") " pod="openshift-infra/auto-csr-approver-29564332-86fnq" Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 
18:52:00.420870 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpfxw\" (UniqueName: \"kubernetes.io/projected/71aaed7d-973b-464e-83d7-cecddfe76eaf-kube-api-access-xpfxw\") pod \"auto-csr-approver-29564332-86fnq\" (UID: \"71aaed7d-973b-464e-83d7-cecddfe76eaf\") " pod="openshift-infra/auto-csr-approver-29564332-86fnq" Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 18:52:00.488074 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564332-86fnq" Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 18:52:00.934144 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564332-86fnq"] Mar 18 18:52:00 crc kubenswrapper[5008]: I0318 18:52:00.947409 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:52:01 crc kubenswrapper[5008]: I0318 18:52:01.556788 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564332-86fnq" event={"ID":"71aaed7d-973b-464e-83d7-cecddfe76eaf","Type":"ContainerStarted","Data":"0b60ba95c3e5d681e1d927dca998e6d4ee1d3590919c793a573babfc7154506b"} Mar 18 18:52:02 crc kubenswrapper[5008]: I0318 18:52:02.566410 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564332-86fnq" event={"ID":"71aaed7d-973b-464e-83d7-cecddfe76eaf","Type":"ContainerStarted","Data":"1f6e4b9e8897b15cbb90d82ee426883d2362ae29fc3883fb53f4aa6ed5610783"} Mar 18 18:52:02 crc kubenswrapper[5008]: I0318 18:52:02.602744 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564332-86fnq" podStartSLOduration=1.416425053 podStartE2EDuration="2.602725359s" podCreationTimestamp="2026-03-18 18:52:00 +0000 UTC" firstStartedPulling="2026-03-18 18:52:00.947157947 +0000 UTC m=+2977.466631036" lastFinishedPulling="2026-03-18 18:52:02.133458263 +0000 UTC 
m=+2978.652931342" observedRunningTime="2026-03-18 18:52:02.601794005 +0000 UTC m=+2979.121267194" watchObservedRunningTime="2026-03-18 18:52:02.602725359 +0000 UTC m=+2979.122198438" Mar 18 18:52:03 crc kubenswrapper[5008]: I0318 18:52:03.585341 5008 generic.go:334] "Generic (PLEG): container finished" podID="71aaed7d-973b-464e-83d7-cecddfe76eaf" containerID="1f6e4b9e8897b15cbb90d82ee426883d2362ae29fc3883fb53f4aa6ed5610783" exitCode=0 Mar 18 18:52:03 crc kubenswrapper[5008]: I0318 18:52:03.585524 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564332-86fnq" event={"ID":"71aaed7d-973b-464e-83d7-cecddfe76eaf","Type":"ContainerDied","Data":"1f6e4b9e8897b15cbb90d82ee426883d2362ae29fc3883fb53f4aa6ed5610783"} Mar 18 18:52:04 crc kubenswrapper[5008]: I0318 18:52:04.886790 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564332-86fnq" Mar 18 18:52:04 crc kubenswrapper[5008]: I0318 18:52:04.975579 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpfxw\" (UniqueName: \"kubernetes.io/projected/71aaed7d-973b-464e-83d7-cecddfe76eaf-kube-api-access-xpfxw\") pod \"71aaed7d-973b-464e-83d7-cecddfe76eaf\" (UID: \"71aaed7d-973b-464e-83d7-cecddfe76eaf\") " Mar 18 18:52:04 crc kubenswrapper[5008]: I0318 18:52:04.981636 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71aaed7d-973b-464e-83d7-cecddfe76eaf-kube-api-access-xpfxw" (OuterVolumeSpecName: "kube-api-access-xpfxw") pod "71aaed7d-973b-464e-83d7-cecddfe76eaf" (UID: "71aaed7d-973b-464e-83d7-cecddfe76eaf"). InnerVolumeSpecName "kube-api-access-xpfxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:52:05 crc kubenswrapper[5008]: I0318 18:52:05.078028 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpfxw\" (UniqueName: \"kubernetes.io/projected/71aaed7d-973b-464e-83d7-cecddfe76eaf-kube-api-access-xpfxw\") on node \"crc\" DevicePath \"\"" Mar 18 18:52:05 crc kubenswrapper[5008]: I0318 18:52:05.605748 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564332-86fnq" event={"ID":"71aaed7d-973b-464e-83d7-cecddfe76eaf","Type":"ContainerDied","Data":"0b60ba95c3e5d681e1d927dca998e6d4ee1d3590919c793a573babfc7154506b"} Mar 18 18:52:05 crc kubenswrapper[5008]: I0318 18:52:05.605793 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b60ba95c3e5d681e1d927dca998e6d4ee1d3590919c793a573babfc7154506b" Mar 18 18:52:05 crc kubenswrapper[5008]: I0318 18:52:05.605791 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564332-86fnq" Mar 18 18:52:05 crc kubenswrapper[5008]: I0318 18:52:05.664712 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564326-l8brh"] Mar 18 18:52:05 crc kubenswrapper[5008]: I0318 18:52:05.668968 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564326-l8brh"] Mar 18 18:52:06 crc kubenswrapper[5008]: I0318 18:52:06.216648 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="675cef98-7f9a-4b71-88cb-1be0c5a34bbe" path="/var/lib/kubelet/pods/675cef98-7f9a-4b71-88cb-1be0c5a34bbe/volumes" Mar 18 18:52:07 crc kubenswrapper[5008]: I0318 18:52:07.198273 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:52:07 crc kubenswrapper[5008]: E0318 18:52:07.198689 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:52:18 crc kubenswrapper[5008]: I0318 18:52:18.198069 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:52:18 crc kubenswrapper[5008]: E0318 18:52:18.198801 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:52:32 crc kubenswrapper[5008]: I0318 18:52:32.198903 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:52:32 crc kubenswrapper[5008]: E0318 18:52:32.199858 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:52:41 crc kubenswrapper[5008]: I0318 18:52:41.529719 5008 scope.go:117] "RemoveContainer" containerID="a3872188e8c3fba23da0dc1940a81362b146d5d214923a2229ea5a04bb36211e" Mar 18 18:52:45 crc kubenswrapper[5008]: I0318 18:52:45.199492 5008 scope.go:117] "RemoveContainer" 
containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:52:45 crc kubenswrapper[5008]: E0318 18:52:45.200374 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.594062 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jc8q6"] Mar 18 18:52:48 crc kubenswrapper[5008]: E0318 18:52:48.594747 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71aaed7d-973b-464e-83d7-cecddfe76eaf" containerName="oc" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.594764 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="71aaed7d-973b-464e-83d7-cecddfe76eaf" containerName="oc" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.594935 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="71aaed7d-973b-464e-83d7-cecddfe76eaf" containerName="oc" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.595906 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.614447 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jc8q6"] Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.638847 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-utilities\") pod \"certified-operators-jc8q6\" (UID: \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\") " pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.638910 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-catalog-content\") pod \"certified-operators-jc8q6\" (UID: \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\") " pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.639006 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsz22\" (UniqueName: \"kubernetes.io/projected/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-kube-api-access-hsz22\") pod \"certified-operators-jc8q6\" (UID: \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\") " pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.740577 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-catalog-content\") pod \"certified-operators-jc8q6\" (UID: \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\") " pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.740679 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hsz22\" (UniqueName: \"kubernetes.io/projected/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-kube-api-access-hsz22\") pod \"certified-operators-jc8q6\" (UID: \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\") " pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.740720 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-utilities\") pod \"certified-operators-jc8q6\" (UID: \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\") " pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.741274 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-catalog-content\") pod \"certified-operators-jc8q6\" (UID: \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\") " pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.741303 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-utilities\") pod \"certified-operators-jc8q6\" (UID: \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\") " pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.791237 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsz22\" (UniqueName: \"kubernetes.io/projected/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-kube-api-access-hsz22\") pod \"certified-operators-jc8q6\" (UID: \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\") " pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:48 crc kubenswrapper[5008]: I0318 18:52:48.936353 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:49 crc kubenswrapper[5008]: I0318 18:52:49.415504 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jc8q6"] Mar 18 18:52:49 crc kubenswrapper[5008]: I0318 18:52:49.978742 5008 generic.go:334] "Generic (PLEG): container finished" podID="0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" containerID="195adff8fbda1cd09cd59002cdc124352772c80d6f320f051e8190f2dac91f03" exitCode=0 Mar 18 18:52:49 crc kubenswrapper[5008]: I0318 18:52:49.978835 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jc8q6" event={"ID":"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0","Type":"ContainerDied","Data":"195adff8fbda1cd09cd59002cdc124352772c80d6f320f051e8190f2dac91f03"} Mar 18 18:52:49 crc kubenswrapper[5008]: I0318 18:52:49.979773 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jc8q6" event={"ID":"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0","Type":"ContainerStarted","Data":"30f781a853d4a8494668705da9032bc0f3ef239657fe7201cffcb15a142bb4f8"} Mar 18 18:52:50 crc kubenswrapper[5008]: I0318 18:52:50.993633 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jc8q6" event={"ID":"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0","Type":"ContainerStarted","Data":"81a3f2292597cd78a5fc9a38bef7be7587dac1c50612871fc0e6ed2a9b564a00"} Mar 18 18:52:52 crc kubenswrapper[5008]: I0318 18:52:52.007482 5008 generic.go:334] "Generic (PLEG): container finished" podID="0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" containerID="81a3f2292597cd78a5fc9a38bef7be7587dac1c50612871fc0e6ed2a9b564a00" exitCode=0 Mar 18 18:52:52 crc kubenswrapper[5008]: I0318 18:52:52.007525 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jc8q6" 
event={"ID":"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0","Type":"ContainerDied","Data":"81a3f2292597cd78a5fc9a38bef7be7587dac1c50612871fc0e6ed2a9b564a00"} Mar 18 18:52:53 crc kubenswrapper[5008]: I0318 18:52:53.021300 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jc8q6" event={"ID":"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0","Type":"ContainerStarted","Data":"da8f699df1b1389869ec1065b0f23b8c78f44e209d0d34c75e562a744b52c865"} Mar 18 18:52:57 crc kubenswrapper[5008]: I0318 18:52:57.198743 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:52:57 crc kubenswrapper[5008]: E0318 18:52:57.200731 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:52:58 crc kubenswrapper[5008]: I0318 18:52:58.937255 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:58 crc kubenswrapper[5008]: I0318 18:52:58.937688 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:59 crc kubenswrapper[5008]: I0318 18:52:59.002180 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:59 crc kubenswrapper[5008]: I0318 18:52:59.032518 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jc8q6" podStartSLOduration=8.4741805 podStartE2EDuration="11.032494261s" 
podCreationTimestamp="2026-03-18 18:52:48 +0000 UTC" firstStartedPulling="2026-03-18 18:52:49.982009368 +0000 UTC m=+3026.501482487" lastFinishedPulling="2026-03-18 18:52:52.540323139 +0000 UTC m=+3029.059796248" observedRunningTime="2026-03-18 18:52:53.056799619 +0000 UTC m=+3029.576272728" watchObservedRunningTime="2026-03-18 18:52:59.032494261 +0000 UTC m=+3035.551967360" Mar 18 18:52:59 crc kubenswrapper[5008]: I0318 18:52:59.135336 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:52:59 crc kubenswrapper[5008]: I0318 18:52:59.244665 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jc8q6"] Mar 18 18:53:01 crc kubenswrapper[5008]: I0318 18:53:01.086228 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jc8q6" podUID="0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" containerName="registry-server" containerID="cri-o://da8f699df1b1389869ec1065b0f23b8c78f44e209d0d34c75e562a744b52c865" gracePeriod=2 Mar 18 18:53:01 crc kubenswrapper[5008]: I0318 18:53:01.560152 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:53:01 crc kubenswrapper[5008]: I0318 18:53:01.664746 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-catalog-content\") pod \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\" (UID: \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\") " Mar 18 18:53:01 crc kubenswrapper[5008]: I0318 18:53:01.664827 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsz22\" (UniqueName: \"kubernetes.io/projected/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-kube-api-access-hsz22\") pod \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\" (UID: \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\") " Mar 18 18:53:01 crc kubenswrapper[5008]: I0318 18:53:01.664914 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-utilities\") pod \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\" (UID: \"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0\") " Mar 18 18:53:01 crc kubenswrapper[5008]: I0318 18:53:01.665987 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-utilities" (OuterVolumeSpecName: "utilities") pod "0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" (UID: "0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:53:01 crc kubenswrapper[5008]: I0318 18:53:01.671264 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-kube-api-access-hsz22" (OuterVolumeSpecName: "kube-api-access-hsz22") pod "0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" (UID: "0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0"). InnerVolumeSpecName "kube-api-access-hsz22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:53:01 crc kubenswrapper[5008]: I0318 18:53:01.748294 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" (UID: "0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:53:01 crc kubenswrapper[5008]: I0318 18:53:01.769595 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:53:01 crc kubenswrapper[5008]: I0318 18:53:01.769631 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsz22\" (UniqueName: \"kubernetes.io/projected/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-kube-api-access-hsz22\") on node \"crc\" DevicePath \"\"" Mar 18 18:53:01 crc kubenswrapper[5008]: I0318 18:53:01.769643 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.094799 5008 generic.go:334] "Generic (PLEG): container finished" podID="0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" containerID="da8f699df1b1389869ec1065b0f23b8c78f44e209d0d34c75e562a744b52c865" exitCode=0 Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.094901 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jc8q6" Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.094906 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jc8q6" event={"ID":"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0","Type":"ContainerDied","Data":"da8f699df1b1389869ec1065b0f23b8c78f44e209d0d34c75e562a744b52c865"} Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.095629 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jc8q6" event={"ID":"0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0","Type":"ContainerDied","Data":"30f781a853d4a8494668705da9032bc0f3ef239657fe7201cffcb15a142bb4f8"} Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.095652 5008 scope.go:117] "RemoveContainer" containerID="da8f699df1b1389869ec1065b0f23b8c78f44e209d0d34c75e562a744b52c865" Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.115345 5008 scope.go:117] "RemoveContainer" containerID="81a3f2292597cd78a5fc9a38bef7be7587dac1c50612871fc0e6ed2a9b564a00" Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.137672 5008 scope.go:117] "RemoveContainer" containerID="195adff8fbda1cd09cd59002cdc124352772c80d6f320f051e8190f2dac91f03" Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.150678 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jc8q6"] Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.156228 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jc8q6"] Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.163517 5008 scope.go:117] "RemoveContainer" containerID="da8f699df1b1389869ec1065b0f23b8c78f44e209d0d34c75e562a744b52c865" Mar 18 18:53:02 crc kubenswrapper[5008]: E0318 18:53:02.164118 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"da8f699df1b1389869ec1065b0f23b8c78f44e209d0d34c75e562a744b52c865\": container with ID starting with da8f699df1b1389869ec1065b0f23b8c78f44e209d0d34c75e562a744b52c865 not found: ID does not exist" containerID="da8f699df1b1389869ec1065b0f23b8c78f44e209d0d34c75e562a744b52c865" Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.164180 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8f699df1b1389869ec1065b0f23b8c78f44e209d0d34c75e562a744b52c865"} err="failed to get container status \"da8f699df1b1389869ec1065b0f23b8c78f44e209d0d34c75e562a744b52c865\": rpc error: code = NotFound desc = could not find container \"da8f699df1b1389869ec1065b0f23b8c78f44e209d0d34c75e562a744b52c865\": container with ID starting with da8f699df1b1389869ec1065b0f23b8c78f44e209d0d34c75e562a744b52c865 not found: ID does not exist" Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.164222 5008 scope.go:117] "RemoveContainer" containerID="81a3f2292597cd78a5fc9a38bef7be7587dac1c50612871fc0e6ed2a9b564a00" Mar 18 18:53:02 crc kubenswrapper[5008]: E0318 18:53:02.164812 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81a3f2292597cd78a5fc9a38bef7be7587dac1c50612871fc0e6ed2a9b564a00\": container with ID starting with 81a3f2292597cd78a5fc9a38bef7be7587dac1c50612871fc0e6ed2a9b564a00 not found: ID does not exist" containerID="81a3f2292597cd78a5fc9a38bef7be7587dac1c50612871fc0e6ed2a9b564a00" Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.164866 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a3f2292597cd78a5fc9a38bef7be7587dac1c50612871fc0e6ed2a9b564a00"} err="failed to get container status \"81a3f2292597cd78a5fc9a38bef7be7587dac1c50612871fc0e6ed2a9b564a00\": rpc error: code = NotFound desc = could not find container \"81a3f2292597cd78a5fc9a38bef7be7587dac1c50612871fc0e6ed2a9b564a00\": container with ID 
starting with 81a3f2292597cd78a5fc9a38bef7be7587dac1c50612871fc0e6ed2a9b564a00 not found: ID does not exist" Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.164897 5008 scope.go:117] "RemoveContainer" containerID="195adff8fbda1cd09cd59002cdc124352772c80d6f320f051e8190f2dac91f03" Mar 18 18:53:02 crc kubenswrapper[5008]: E0318 18:53:02.165232 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195adff8fbda1cd09cd59002cdc124352772c80d6f320f051e8190f2dac91f03\": container with ID starting with 195adff8fbda1cd09cd59002cdc124352772c80d6f320f051e8190f2dac91f03 not found: ID does not exist" containerID="195adff8fbda1cd09cd59002cdc124352772c80d6f320f051e8190f2dac91f03" Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.165263 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195adff8fbda1cd09cd59002cdc124352772c80d6f320f051e8190f2dac91f03"} err="failed to get container status \"195adff8fbda1cd09cd59002cdc124352772c80d6f320f051e8190f2dac91f03\": rpc error: code = NotFound desc = could not find container \"195adff8fbda1cd09cd59002cdc124352772c80d6f320f051e8190f2dac91f03\": container with ID starting with 195adff8fbda1cd09cd59002cdc124352772c80d6f320f051e8190f2dac91f03 not found: ID does not exist" Mar 18 18:53:02 crc kubenswrapper[5008]: I0318 18:53:02.210757 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" path="/var/lib/kubelet/pods/0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0/volumes" Mar 18 18:53:08 crc kubenswrapper[5008]: I0318 18:53:08.198665 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:53:08 crc kubenswrapper[5008]: E0318 18:53:08.199845 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:53:20 crc kubenswrapper[5008]: I0318 18:53:20.199021 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:53:20 crc kubenswrapper[5008]: E0318 18:53:20.199604 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:53:32 crc kubenswrapper[5008]: I0318 18:53:32.199246 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:53:32 crc kubenswrapper[5008]: E0318 18:53:32.200214 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:53:44 crc kubenswrapper[5008]: I0318 18:53:44.203155 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:53:44 crc kubenswrapper[5008]: E0318 18:53:44.203983 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:53:57 crc kubenswrapper[5008]: I0318 18:53:57.199112 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:53:57 crc kubenswrapper[5008]: E0318 18:53:57.200101 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.159593 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564334-tqlpj"] Mar 18 18:54:00 crc kubenswrapper[5008]: E0318 18:54:00.160384 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" containerName="extract-content" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.160408 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" containerName="extract-content" Mar 18 18:54:00 crc kubenswrapper[5008]: E0318 18:54:00.160423 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" containerName="extract-utilities" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.160434 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" containerName="extract-utilities" Mar 18 18:54:00 crc kubenswrapper[5008]: E0318 18:54:00.160453 5008 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" containerName="registry-server" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.160464 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" containerName="registry-server" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.161044 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f94fe51-2cba-4ebe-b9ec-4e8b2bb53ae0" containerName="registry-server" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.161796 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564334-tqlpj" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.166480 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.166799 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.167961 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.171605 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564334-tqlpj"] Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.335781 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9wpt\" (UniqueName: \"kubernetes.io/projected/7b2493a1-6f79-453b-896e-2b36a181a652-kube-api-access-z9wpt\") pod \"auto-csr-approver-29564334-tqlpj\" (UID: \"7b2493a1-6f79-453b-896e-2b36a181a652\") " pod="openshift-infra/auto-csr-approver-29564334-tqlpj" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.437888 5008 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-z9wpt\" (UniqueName: \"kubernetes.io/projected/7b2493a1-6f79-453b-896e-2b36a181a652-kube-api-access-z9wpt\") pod \"auto-csr-approver-29564334-tqlpj\" (UID: \"7b2493a1-6f79-453b-896e-2b36a181a652\") " pod="openshift-infra/auto-csr-approver-29564334-tqlpj" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.466823 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9wpt\" (UniqueName: \"kubernetes.io/projected/7b2493a1-6f79-453b-896e-2b36a181a652-kube-api-access-z9wpt\") pod \"auto-csr-approver-29564334-tqlpj\" (UID: \"7b2493a1-6f79-453b-896e-2b36a181a652\") " pod="openshift-infra/auto-csr-approver-29564334-tqlpj" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.482722 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564334-tqlpj" Mar 18 18:54:00 crc kubenswrapper[5008]: I0318 18:54:00.926278 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564334-tqlpj"] Mar 18 18:54:01 crc kubenswrapper[5008]: I0318 18:54:01.616013 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564334-tqlpj" event={"ID":"7b2493a1-6f79-453b-896e-2b36a181a652","Type":"ContainerStarted","Data":"3ccfdd08c3173f3caf5cbf6232b6656f320e91e95a43a5f90131ebf7d4a5db96"} Mar 18 18:54:03 crc kubenswrapper[5008]: I0318 18:54:03.640092 5008 generic.go:334] "Generic (PLEG): container finished" podID="7b2493a1-6f79-453b-896e-2b36a181a652" containerID="6fcfabb6c561f32b59bc8ca9e187df3242a4999cea765dd32fe8a6484d93cb95" exitCode=0 Mar 18 18:54:03 crc kubenswrapper[5008]: I0318 18:54:03.640156 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564334-tqlpj" event={"ID":"7b2493a1-6f79-453b-896e-2b36a181a652","Type":"ContainerDied","Data":"6fcfabb6c561f32b59bc8ca9e187df3242a4999cea765dd32fe8a6484d93cb95"} Mar 18 18:54:05 crc 
kubenswrapper[5008]: I0318 18:54:05.062297 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564334-tqlpj" Mar 18 18:54:05 crc kubenswrapper[5008]: I0318 18:54:05.212501 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9wpt\" (UniqueName: \"kubernetes.io/projected/7b2493a1-6f79-453b-896e-2b36a181a652-kube-api-access-z9wpt\") pod \"7b2493a1-6f79-453b-896e-2b36a181a652\" (UID: \"7b2493a1-6f79-453b-896e-2b36a181a652\") " Mar 18 18:54:05 crc kubenswrapper[5008]: I0318 18:54:05.816772 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b2493a1-6f79-453b-896e-2b36a181a652-kube-api-access-z9wpt" (OuterVolumeSpecName: "kube-api-access-z9wpt") pod "7b2493a1-6f79-453b-896e-2b36a181a652" (UID: "7b2493a1-6f79-453b-896e-2b36a181a652"). InnerVolumeSpecName "kube-api-access-z9wpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:54:05 crc kubenswrapper[5008]: I0318 18:54:05.826476 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9wpt\" (UniqueName: \"kubernetes.io/projected/7b2493a1-6f79-453b-896e-2b36a181a652-kube-api-access-z9wpt\") on node \"crc\" DevicePath \"\"" Mar 18 18:54:05 crc kubenswrapper[5008]: I0318 18:54:05.840318 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564334-tqlpj" event={"ID":"7b2493a1-6f79-453b-896e-2b36a181a652","Type":"ContainerDied","Data":"3ccfdd08c3173f3caf5cbf6232b6656f320e91e95a43a5f90131ebf7d4a5db96"} Mar 18 18:54:05 crc kubenswrapper[5008]: I0318 18:54:05.840384 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ccfdd08c3173f3caf5cbf6232b6656f320e91e95a43a5f90131ebf7d4a5db96" Mar 18 18:54:05 crc kubenswrapper[5008]: I0318 18:54:05.840406 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564334-tqlpj" Mar 18 18:54:06 crc kubenswrapper[5008]: I0318 18:54:06.896632 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564328-djpsv"] Mar 18 18:54:06 crc kubenswrapper[5008]: I0318 18:54:06.908196 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564328-djpsv"] Mar 18 18:54:08 crc kubenswrapper[5008]: I0318 18:54:08.214739 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb505c7-9e8c-4a09-9187-fb14bc31a590" path="/var/lib/kubelet/pods/ccb505c7-9e8c-4a09-9187-fb14bc31a590/volumes" Mar 18 18:54:10 crc kubenswrapper[5008]: I0318 18:54:10.199259 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:54:10 crc kubenswrapper[5008]: E0318 18:54:10.199846 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:54:24 crc kubenswrapper[5008]: I0318 18:54:24.203319 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:54:24 crc kubenswrapper[5008]: E0318 18:54:24.204513 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" 
podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:54:36 crc kubenswrapper[5008]: I0318 18:54:36.199266 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:54:36 crc kubenswrapper[5008]: E0318 18:54:36.200279 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:54:41 crc kubenswrapper[5008]: I0318 18:54:41.656524 5008 scope.go:117] "RemoveContainer" containerID="b1f32e6cb0c79b069393df61652e98a85c9cc3a4363db8b0aac3ab81e8794450" Mar 18 18:54:50 crc kubenswrapper[5008]: I0318 18:54:50.198663 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:54:50 crc kubenswrapper[5008]: E0318 18:54:50.199966 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:55:04 crc kubenswrapper[5008]: I0318 18:55:04.205690 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:55:04 crc kubenswrapper[5008]: E0318 18:55:04.206914 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:55:17 crc kubenswrapper[5008]: I0318 18:55:17.198106 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:55:17 crc kubenswrapper[5008]: E0318 18:55:17.198776 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:55:32 crc kubenswrapper[5008]: I0318 18:55:32.198273 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:55:32 crc kubenswrapper[5008]: E0318 18:55:32.199216 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:55:47 crc kubenswrapper[5008]: I0318 18:55:47.199259 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:55:47 crc kubenswrapper[5008]: E0318 18:55:47.203115 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:56:00 crc kubenswrapper[5008]: I0318 18:56:00.217782 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564336-qchwh"] Mar 18 18:56:00 crc kubenswrapper[5008]: E0318 18:56:00.218879 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2493a1-6f79-453b-896e-2b36a181a652" containerName="oc" Mar 18 18:56:00 crc kubenswrapper[5008]: I0318 18:56:00.218902 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2493a1-6f79-453b-896e-2b36a181a652" containerName="oc" Mar 18 18:56:00 crc kubenswrapper[5008]: I0318 18:56:00.219190 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b2493a1-6f79-453b-896e-2b36a181a652" containerName="oc" Mar 18 18:56:00 crc kubenswrapper[5008]: I0318 18:56:00.219936 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564336-qchwh" Mar 18 18:56:00 crc kubenswrapper[5008]: I0318 18:56:00.223788 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 18:56:00 crc kubenswrapper[5008]: I0318 18:56:00.224153 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 18:56:00 crc kubenswrapper[5008]: I0318 18:56:00.225821 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 18:56:00 crc kubenswrapper[5008]: I0318 18:56:00.232133 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564336-qchwh"] Mar 18 18:56:00 crc kubenswrapper[5008]: I0318 18:56:00.392947 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8fwh\" (UniqueName: \"kubernetes.io/projected/01c512e1-98d9-45c4-a62a-397ef227b76f-kube-api-access-b8fwh\") pod \"auto-csr-approver-29564336-qchwh\" (UID: \"01c512e1-98d9-45c4-a62a-397ef227b76f\") " pod="openshift-infra/auto-csr-approver-29564336-qchwh" Mar 18 18:56:00 crc kubenswrapper[5008]: I0318 18:56:00.494746 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8fwh\" (UniqueName: \"kubernetes.io/projected/01c512e1-98d9-45c4-a62a-397ef227b76f-kube-api-access-b8fwh\") pod \"auto-csr-approver-29564336-qchwh\" (UID: \"01c512e1-98d9-45c4-a62a-397ef227b76f\") " pod="openshift-infra/auto-csr-approver-29564336-qchwh" Mar 18 18:56:00 crc kubenswrapper[5008]: I0318 18:56:00.519565 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8fwh\" (UniqueName: \"kubernetes.io/projected/01c512e1-98d9-45c4-a62a-397ef227b76f-kube-api-access-b8fwh\") pod \"auto-csr-approver-29564336-qchwh\" (UID: \"01c512e1-98d9-45c4-a62a-397ef227b76f\") " 
pod="openshift-infra/auto-csr-approver-29564336-qchwh" Mar 18 18:56:00 crc kubenswrapper[5008]: I0318 18:56:00.544998 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564336-qchwh" Mar 18 18:56:00 crc kubenswrapper[5008]: I0318 18:56:00.990362 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564336-qchwh"] Mar 18 18:56:01 crc kubenswrapper[5008]: I0318 18:56:01.938665 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564336-qchwh" event={"ID":"01c512e1-98d9-45c4-a62a-397ef227b76f","Type":"ContainerStarted","Data":"f5bf29c70f530fac714fc3ae6ef9a23182d356f1655960550605f279ca57f0ff"} Mar 18 18:56:02 crc kubenswrapper[5008]: I0318 18:56:02.198710 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:56:02 crc kubenswrapper[5008]: E0318 18:56:02.198941 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:56:02 crc kubenswrapper[5008]: I0318 18:56:02.946484 5008 generic.go:334] "Generic (PLEG): container finished" podID="01c512e1-98d9-45c4-a62a-397ef227b76f" containerID="f38c8736c1a4e49ef40eef05ae16137367a1c030516d333d411814dc23921f9a" exitCode=0 Mar 18 18:56:02 crc kubenswrapper[5008]: I0318 18:56:02.946543 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564336-qchwh" event={"ID":"01c512e1-98d9-45c4-a62a-397ef227b76f","Type":"ContainerDied","Data":"f38c8736c1a4e49ef40eef05ae16137367a1c030516d333d411814dc23921f9a"} 
Mar 18 18:56:04 crc kubenswrapper[5008]: I0318 18:56:04.245347 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564336-qchwh" Mar 18 18:56:04 crc kubenswrapper[5008]: I0318 18:56:04.349210 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8fwh\" (UniqueName: \"kubernetes.io/projected/01c512e1-98d9-45c4-a62a-397ef227b76f-kube-api-access-b8fwh\") pod \"01c512e1-98d9-45c4-a62a-397ef227b76f\" (UID: \"01c512e1-98d9-45c4-a62a-397ef227b76f\") " Mar 18 18:56:04 crc kubenswrapper[5008]: I0318 18:56:04.356457 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01c512e1-98d9-45c4-a62a-397ef227b76f-kube-api-access-b8fwh" (OuterVolumeSpecName: "kube-api-access-b8fwh") pod "01c512e1-98d9-45c4-a62a-397ef227b76f" (UID: "01c512e1-98d9-45c4-a62a-397ef227b76f"). InnerVolumeSpecName "kube-api-access-b8fwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:56:04 crc kubenswrapper[5008]: I0318 18:56:04.451277 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8fwh\" (UniqueName: \"kubernetes.io/projected/01c512e1-98d9-45c4-a62a-397ef227b76f-kube-api-access-b8fwh\") on node \"crc\" DevicePath \"\"" Mar 18 18:56:04 crc kubenswrapper[5008]: I0318 18:56:04.961404 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564336-qchwh" event={"ID":"01c512e1-98d9-45c4-a62a-397ef227b76f","Type":"ContainerDied","Data":"f5bf29c70f530fac714fc3ae6ef9a23182d356f1655960550605f279ca57f0ff"} Mar 18 18:56:04 crc kubenswrapper[5008]: I0318 18:56:04.961451 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5bf29c70f530fac714fc3ae6ef9a23182d356f1655960550605f279ca57f0ff" Mar 18 18:56:04 crc kubenswrapper[5008]: I0318 18:56:04.961470 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564336-qchwh" Mar 18 18:56:05 crc kubenswrapper[5008]: I0318 18:56:05.347760 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564330-s9lqc"] Mar 18 18:56:05 crc kubenswrapper[5008]: I0318 18:56:05.358733 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564330-s9lqc"] Mar 18 18:56:06 crc kubenswrapper[5008]: I0318 18:56:06.209855 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef74e457-30de-42f8-99f1-fcf9bc99e004" path="/var/lib/kubelet/pods/ef74e457-30de-42f8-99f1-fcf9bc99e004/volumes" Mar 18 18:56:14 crc kubenswrapper[5008]: I0318 18:56:14.198049 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:56:14 crc kubenswrapper[5008]: E0318 18:56:14.198811 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 18:56:25 crc kubenswrapper[5008]: I0318 18:56:25.198716 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 18:56:26 crc kubenswrapper[5008]: I0318 18:56:26.163074 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"1bf88bf8ac33b078694e8969100bad04b434510b86f4de3b740392ceba02f57d"} Mar 18 18:56:41 crc kubenswrapper[5008]: I0318 18:56:41.766177 5008 scope.go:117] "RemoveContainer" 
containerID="3f55aa2d0af21bf3bad2e036344e8f8d7af9e9adce0fe493e391c1b267644224" Mar 18 18:57:38 crc kubenswrapper[5008]: I0318 18:57:38.896452 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fkfxk"] Mar 18 18:57:38 crc kubenswrapper[5008]: E0318 18:57:38.897605 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01c512e1-98d9-45c4-a62a-397ef227b76f" containerName="oc" Mar 18 18:57:38 crc kubenswrapper[5008]: I0318 18:57:38.897627 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c512e1-98d9-45c4-a62a-397ef227b76f" containerName="oc" Mar 18 18:57:38 crc kubenswrapper[5008]: I0318 18:57:38.897919 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="01c512e1-98d9-45c4-a62a-397ef227b76f" containerName="oc" Mar 18 18:57:38 crc kubenswrapper[5008]: I0318 18:57:38.899503 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:38 crc kubenswrapper[5008]: I0318 18:57:38.921546 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fkfxk"] Mar 18 18:57:39 crc kubenswrapper[5008]: I0318 18:57:39.036040 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-utilities\") pod \"community-operators-fkfxk\" (UID: \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\") " pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:39 crc kubenswrapper[5008]: I0318 18:57:39.036122 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-catalog-content\") pod \"community-operators-fkfxk\" (UID: \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\") " pod="openshift-marketplace/community-operators-fkfxk" Mar 
18 18:57:39 crc kubenswrapper[5008]: I0318 18:57:39.036185 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4khdg\" (UniqueName: \"kubernetes.io/projected/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-kube-api-access-4khdg\") pod \"community-operators-fkfxk\" (UID: \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\") " pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:39 crc kubenswrapper[5008]: I0318 18:57:39.137508 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-catalog-content\") pod \"community-operators-fkfxk\" (UID: \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\") " pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:39 crc kubenswrapper[5008]: I0318 18:57:39.137630 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4khdg\" (UniqueName: \"kubernetes.io/projected/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-kube-api-access-4khdg\") pod \"community-operators-fkfxk\" (UID: \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\") " pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:39 crc kubenswrapper[5008]: I0318 18:57:39.137700 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-utilities\") pod \"community-operators-fkfxk\" (UID: \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\") " pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:39 crc kubenswrapper[5008]: I0318 18:57:39.138162 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-utilities\") pod \"community-operators-fkfxk\" (UID: \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\") " pod="openshift-marketplace/community-operators-fkfxk" Mar 18 
18:57:39 crc kubenswrapper[5008]: I0318 18:57:39.138667 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-catalog-content\") pod \"community-operators-fkfxk\" (UID: \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\") " pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:39 crc kubenswrapper[5008]: I0318 18:57:39.172217 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4khdg\" (UniqueName: \"kubernetes.io/projected/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-kube-api-access-4khdg\") pod \"community-operators-fkfxk\" (UID: \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\") " pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:39 crc kubenswrapper[5008]: I0318 18:57:39.235696 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:39 crc kubenswrapper[5008]: I0318 18:57:39.830097 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fkfxk"] Mar 18 18:57:40 crc kubenswrapper[5008]: I0318 18:57:40.847475 5008 generic.go:334] "Generic (PLEG): container finished" podID="c2546a6b-5e2f-4f47-b762-c2657c16d9bf" containerID="e54c9a7c909d0a6332b64c2265184345148e239db7484929457b818452787da5" exitCode=0 Mar 18 18:57:40 crc kubenswrapper[5008]: I0318 18:57:40.847535 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkfxk" event={"ID":"c2546a6b-5e2f-4f47-b762-c2657c16d9bf","Type":"ContainerDied","Data":"e54c9a7c909d0a6332b64c2265184345148e239db7484929457b818452787da5"} Mar 18 18:57:40 crc kubenswrapper[5008]: I0318 18:57:40.847875 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkfxk" 
event={"ID":"c2546a6b-5e2f-4f47-b762-c2657c16d9bf","Type":"ContainerStarted","Data":"bfb4bc6ba2f5b4db10ac25f711852626886957803d830ecc966327c82bd00d84"} Mar 18 18:57:40 crc kubenswrapper[5008]: I0318 18:57:40.850096 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 18:57:41 crc kubenswrapper[5008]: I0318 18:57:41.858855 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkfxk" event={"ID":"c2546a6b-5e2f-4f47-b762-c2657c16d9bf","Type":"ContainerStarted","Data":"1df890ba4f5ebd196e539f4a397d8274be6708651cb628adec29fa8cac4cc443"} Mar 18 18:57:42 crc kubenswrapper[5008]: I0318 18:57:42.869867 5008 generic.go:334] "Generic (PLEG): container finished" podID="c2546a6b-5e2f-4f47-b762-c2657c16d9bf" containerID="1df890ba4f5ebd196e539f4a397d8274be6708651cb628adec29fa8cac4cc443" exitCode=0 Mar 18 18:57:42 crc kubenswrapper[5008]: I0318 18:57:42.869910 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkfxk" event={"ID":"c2546a6b-5e2f-4f47-b762-c2657c16d9bf","Type":"ContainerDied","Data":"1df890ba4f5ebd196e539f4a397d8274be6708651cb628adec29fa8cac4cc443"} Mar 18 18:57:43 crc kubenswrapper[5008]: I0318 18:57:43.880370 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkfxk" event={"ID":"c2546a6b-5e2f-4f47-b762-c2657c16d9bf","Type":"ContainerStarted","Data":"2fdf05268c29cb73685a9ff3e7b53badf2760e47381f00009a54718477fef2e8"} Mar 18 18:57:43 crc kubenswrapper[5008]: I0318 18:57:43.910688 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fkfxk" podStartSLOduration=3.456415596 podStartE2EDuration="5.910669789s" podCreationTimestamp="2026-03-18 18:57:38 +0000 UTC" firstStartedPulling="2026-03-18 18:57:40.84937232 +0000 UTC m=+3317.368845429" lastFinishedPulling="2026-03-18 18:57:43.303626503 +0000 UTC 
m=+3319.823099622" observedRunningTime="2026-03-18 18:57:43.903714727 +0000 UTC m=+3320.423187816" watchObservedRunningTime="2026-03-18 18:57:43.910669789 +0000 UTC m=+3320.430142868" Mar 18 18:57:49 crc kubenswrapper[5008]: I0318 18:57:49.236697 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:49 crc kubenswrapper[5008]: I0318 18:57:49.237139 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:49 crc kubenswrapper[5008]: I0318 18:57:49.310854 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:49 crc kubenswrapper[5008]: I0318 18:57:49.989216 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:50 crc kubenswrapper[5008]: I0318 18:57:50.043168 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fkfxk"] Mar 18 18:57:51 crc kubenswrapper[5008]: I0318 18:57:51.949442 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fkfxk" podUID="c2546a6b-5e2f-4f47-b762-c2657c16d9bf" containerName="registry-server" containerID="cri-o://2fdf05268c29cb73685a9ff3e7b53badf2760e47381f00009a54718477fef2e8" gracePeriod=2 Mar 18 18:57:52 crc kubenswrapper[5008]: I0318 18:57:52.956518 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:52 crc kubenswrapper[5008]: I0318 18:57:52.961484 5008 generic.go:334] "Generic (PLEG): container finished" podID="c2546a6b-5e2f-4f47-b762-c2657c16d9bf" containerID="2fdf05268c29cb73685a9ff3e7b53badf2760e47381f00009a54718477fef2e8" exitCode=0 Mar 18 18:57:52 crc kubenswrapper[5008]: I0318 18:57:52.961518 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkfxk" event={"ID":"c2546a6b-5e2f-4f47-b762-c2657c16d9bf","Type":"ContainerDied","Data":"2fdf05268c29cb73685a9ff3e7b53badf2760e47381f00009a54718477fef2e8"} Mar 18 18:57:52 crc kubenswrapper[5008]: I0318 18:57:52.961542 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkfxk" event={"ID":"c2546a6b-5e2f-4f47-b762-c2657c16d9bf","Type":"ContainerDied","Data":"bfb4bc6ba2f5b4db10ac25f711852626886957803d830ecc966327c82bd00d84"} Mar 18 18:57:52 crc kubenswrapper[5008]: I0318 18:57:52.961610 5008 scope.go:117] "RemoveContainer" containerID="2fdf05268c29cb73685a9ff3e7b53badf2760e47381f00009a54718477fef2e8" Mar 18 18:57:52 crc kubenswrapper[5008]: I0318 18:57:52.961695 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fkfxk" Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.000167 5008 scope.go:117] "RemoveContainer" containerID="1df890ba4f5ebd196e539f4a397d8274be6708651cb628adec29fa8cac4cc443" Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.021213 5008 scope.go:117] "RemoveContainer" containerID="e54c9a7c909d0a6332b64c2265184345148e239db7484929457b818452787da5" Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.052775 5008 scope.go:117] "RemoveContainer" containerID="2fdf05268c29cb73685a9ff3e7b53badf2760e47381f00009a54718477fef2e8" Mar 18 18:57:53 crc kubenswrapper[5008]: E0318 18:57:53.054172 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fdf05268c29cb73685a9ff3e7b53badf2760e47381f00009a54718477fef2e8\": container with ID starting with 2fdf05268c29cb73685a9ff3e7b53badf2760e47381f00009a54718477fef2e8 not found: ID does not exist" containerID="2fdf05268c29cb73685a9ff3e7b53badf2760e47381f00009a54718477fef2e8" Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.054236 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fdf05268c29cb73685a9ff3e7b53badf2760e47381f00009a54718477fef2e8"} err="failed to get container status \"2fdf05268c29cb73685a9ff3e7b53badf2760e47381f00009a54718477fef2e8\": rpc error: code = NotFound desc = could not find container \"2fdf05268c29cb73685a9ff3e7b53badf2760e47381f00009a54718477fef2e8\": container with ID starting with 2fdf05268c29cb73685a9ff3e7b53badf2760e47381f00009a54718477fef2e8 not found: ID does not exist" Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.054278 5008 scope.go:117] "RemoveContainer" containerID="1df890ba4f5ebd196e539f4a397d8274be6708651cb628adec29fa8cac4cc443" Mar 18 18:57:53 crc kubenswrapper[5008]: E0318 18:57:53.054835 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"1df890ba4f5ebd196e539f4a397d8274be6708651cb628adec29fa8cac4cc443\": container with ID starting with 1df890ba4f5ebd196e539f4a397d8274be6708651cb628adec29fa8cac4cc443 not found: ID does not exist" containerID="1df890ba4f5ebd196e539f4a397d8274be6708651cb628adec29fa8cac4cc443"
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.054905 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1df890ba4f5ebd196e539f4a397d8274be6708651cb628adec29fa8cac4cc443"} err="failed to get container status \"1df890ba4f5ebd196e539f4a397d8274be6708651cb628adec29fa8cac4cc443\": rpc error: code = NotFound desc = could not find container \"1df890ba4f5ebd196e539f4a397d8274be6708651cb628adec29fa8cac4cc443\": container with ID starting with 1df890ba4f5ebd196e539f4a397d8274be6708651cb628adec29fa8cac4cc443 not found: ID does not exist"
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.054948 5008 scope.go:117] "RemoveContainer" containerID="e54c9a7c909d0a6332b64c2265184345148e239db7484929457b818452787da5"
Mar 18 18:57:53 crc kubenswrapper[5008]: E0318 18:57:53.055325 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e54c9a7c909d0a6332b64c2265184345148e239db7484929457b818452787da5\": container with ID starting with e54c9a7c909d0a6332b64c2265184345148e239db7484929457b818452787da5 not found: ID does not exist" containerID="e54c9a7c909d0a6332b64c2265184345148e239db7484929457b818452787da5"
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.055369 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e54c9a7c909d0a6332b64c2265184345148e239db7484929457b818452787da5"} err="failed to get container status \"e54c9a7c909d0a6332b64c2265184345148e239db7484929457b818452787da5\": rpc error: code = NotFound desc = could not find container \"e54c9a7c909d0a6332b64c2265184345148e239db7484929457b818452787da5\": container with ID starting with e54c9a7c909d0a6332b64c2265184345148e239db7484929457b818452787da5 not found: ID does not exist"
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.079407 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-catalog-content\") pod \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\" (UID: \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\") "
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.079535 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-utilities\") pod \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\" (UID: \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\") "
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.079637 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4khdg\" (UniqueName: \"kubernetes.io/projected/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-kube-api-access-4khdg\") pod \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\" (UID: \"c2546a6b-5e2f-4f47-b762-c2657c16d9bf\") "
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.081826 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-utilities" (OuterVolumeSpecName: "utilities") pod "c2546a6b-5e2f-4f47-b762-c2657c16d9bf" (UID: "c2546a6b-5e2f-4f47-b762-c2657c16d9bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.089100 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-kube-api-access-4khdg" (OuterVolumeSpecName: "kube-api-access-4khdg") pod "c2546a6b-5e2f-4f47-b762-c2657c16d9bf" (UID: "c2546a6b-5e2f-4f47-b762-c2657c16d9bf"). InnerVolumeSpecName "kube-api-access-4khdg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.142710 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2546a6b-5e2f-4f47-b762-c2657c16d9bf" (UID: "c2546a6b-5e2f-4f47-b762-c2657c16d9bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.180918 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.180962 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4khdg\" (UniqueName: \"kubernetes.io/projected/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-kube-api-access-4khdg\") on node \"crc\" DevicePath \"\""
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.180974 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2546a6b-5e2f-4f47-b762-c2657c16d9bf-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.300658 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fkfxk"]
Mar 18 18:57:53 crc kubenswrapper[5008]: I0318 18:57:53.309199 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fkfxk"]
Mar 18 18:57:54 crc kubenswrapper[5008]: I0318 18:57:54.213841 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2546a6b-5e2f-4f47-b762-c2657c16d9bf" path="/var/lib/kubelet/pods/c2546a6b-5e2f-4f47-b762-c2657c16d9bf/volumes"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.158514 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564338-zhbpt"]
Mar 18 18:58:00 crc kubenswrapper[5008]: E0318 18:58:00.159470 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2546a6b-5e2f-4f47-b762-c2657c16d9bf" containerName="extract-content"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.159491 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2546a6b-5e2f-4f47-b762-c2657c16d9bf" containerName="extract-content"
Mar 18 18:58:00 crc kubenswrapper[5008]: E0318 18:58:00.159517 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2546a6b-5e2f-4f47-b762-c2657c16d9bf" containerName="registry-server"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.159530 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2546a6b-5e2f-4f47-b762-c2657c16d9bf" containerName="registry-server"
Mar 18 18:58:00 crc kubenswrapper[5008]: E0318 18:58:00.159703 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2546a6b-5e2f-4f47-b762-c2657c16d9bf" containerName="extract-utilities"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.159716 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2546a6b-5e2f-4f47-b762-c2657c16d9bf" containerName="extract-utilities"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.159880 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2546a6b-5e2f-4f47-b762-c2657c16d9bf" containerName="registry-server"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.160655 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564338-zhbpt"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.162963 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.163400 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.163618 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.175996 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564338-zhbpt"]
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.188999 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mprf5\" (UniqueName: \"kubernetes.io/projected/76db5842-8f74-450c-9965-9a1bc8ae06e1-kube-api-access-mprf5\") pod \"auto-csr-approver-29564338-zhbpt\" (UID: \"76db5842-8f74-450c-9965-9a1bc8ae06e1\") " pod="openshift-infra/auto-csr-approver-29564338-zhbpt"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.291723 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mprf5\" (UniqueName: \"kubernetes.io/projected/76db5842-8f74-450c-9965-9a1bc8ae06e1-kube-api-access-mprf5\") pod \"auto-csr-approver-29564338-zhbpt\" (UID: \"76db5842-8f74-450c-9965-9a1bc8ae06e1\") " pod="openshift-infra/auto-csr-approver-29564338-zhbpt"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.321358 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mprf5\" (UniqueName: \"kubernetes.io/projected/76db5842-8f74-450c-9965-9a1bc8ae06e1-kube-api-access-mprf5\") pod \"auto-csr-approver-29564338-zhbpt\" (UID: \"76db5842-8f74-450c-9965-9a1bc8ae06e1\") " pod="openshift-infra/auto-csr-approver-29564338-zhbpt"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.479805 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564338-zhbpt"
Mar 18 18:58:00 crc kubenswrapper[5008]: I0318 18:58:00.733963 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564338-zhbpt"]
Mar 18 18:58:01 crc kubenswrapper[5008]: I0318 18:58:01.044196 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564338-zhbpt" event={"ID":"76db5842-8f74-450c-9965-9a1bc8ae06e1","Type":"ContainerStarted","Data":"e7ff48364eb2e92a85c9e262a4f571ef775e2dc41d8b88972277d15af75f387f"}
Mar 18 18:58:03 crc kubenswrapper[5008]: I0318 18:58:03.067753 5008 generic.go:334] "Generic (PLEG): container finished" podID="76db5842-8f74-450c-9965-9a1bc8ae06e1" containerID="d3a270235e76f186dbb8b385d2655225809071c83154c3d39d1e4facb97b922d" exitCode=0
Mar 18 18:58:03 crc kubenswrapper[5008]: I0318 18:58:03.068163 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564338-zhbpt" event={"ID":"76db5842-8f74-450c-9965-9a1bc8ae06e1","Type":"ContainerDied","Data":"d3a270235e76f186dbb8b385d2655225809071c83154c3d39d1e4facb97b922d"}
Mar 18 18:58:04 crc kubenswrapper[5008]: I0318 18:58:04.380998 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564338-zhbpt"
Mar 18 18:58:04 crc kubenswrapper[5008]: I0318 18:58:04.553788 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mprf5\" (UniqueName: \"kubernetes.io/projected/76db5842-8f74-450c-9965-9a1bc8ae06e1-kube-api-access-mprf5\") pod \"76db5842-8f74-450c-9965-9a1bc8ae06e1\" (UID: \"76db5842-8f74-450c-9965-9a1bc8ae06e1\") "
Mar 18 18:58:04 crc kubenswrapper[5008]: I0318 18:58:04.561473 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76db5842-8f74-450c-9965-9a1bc8ae06e1-kube-api-access-mprf5" (OuterVolumeSpecName: "kube-api-access-mprf5") pod "76db5842-8f74-450c-9965-9a1bc8ae06e1" (UID: "76db5842-8f74-450c-9965-9a1bc8ae06e1"). InnerVolumeSpecName "kube-api-access-mprf5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:58:04 crc kubenswrapper[5008]: I0318 18:58:04.655384 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mprf5\" (UniqueName: \"kubernetes.io/projected/76db5842-8f74-450c-9965-9a1bc8ae06e1-kube-api-access-mprf5\") on node \"crc\" DevicePath \"\""
Mar 18 18:58:05 crc kubenswrapper[5008]: I0318 18:58:05.092118 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564338-zhbpt" event={"ID":"76db5842-8f74-450c-9965-9a1bc8ae06e1","Type":"ContainerDied","Data":"e7ff48364eb2e92a85c9e262a4f571ef775e2dc41d8b88972277d15af75f387f"}
Mar 18 18:58:05 crc kubenswrapper[5008]: I0318 18:58:05.092177 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7ff48364eb2e92a85c9e262a4f571ef775e2dc41d8b88972277d15af75f387f"
Mar 18 18:58:05 crc kubenswrapper[5008]: I0318 18:58:05.092199 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564338-zhbpt"
Mar 18 18:58:05 crc kubenswrapper[5008]: I0318 18:58:05.468950 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564332-86fnq"]
Mar 18 18:58:05 crc kubenswrapper[5008]: I0318 18:58:05.475022 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564332-86fnq"]
Mar 18 18:58:06 crc kubenswrapper[5008]: I0318 18:58:06.212053 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71aaed7d-973b-464e-83d7-cecddfe76eaf" path="/var/lib/kubelet/pods/71aaed7d-973b-464e-83d7-cecddfe76eaf/volumes"
Mar 18 18:58:41 crc kubenswrapper[5008]: I0318 18:58:41.872082 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-grhhl"]
Mar 18 18:58:41 crc kubenswrapper[5008]: E0318 18:58:41.873082 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76db5842-8f74-450c-9965-9a1bc8ae06e1" containerName="oc"
Mar 18 18:58:41 crc kubenswrapper[5008]: I0318 18:58:41.873102 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="76db5842-8f74-450c-9965-9a1bc8ae06e1" containerName="oc"
Mar 18 18:58:41 crc kubenswrapper[5008]: I0318 18:58:41.873374 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="76db5842-8f74-450c-9965-9a1bc8ae06e1" containerName="oc"
Mar 18 18:58:41 crc kubenswrapper[5008]: I0318 18:58:41.875078 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:58:41 crc kubenswrapper[5008]: I0318 18:58:41.891660 5008 scope.go:117] "RemoveContainer" containerID="1f6e4b9e8897b15cbb90d82ee426883d2362ae29fc3883fb53f4aa6ed5610783"
Mar 18 18:58:41 crc kubenswrapper[5008]: I0318 18:58:41.905174 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grhhl"]
Mar 18 18:58:41 crc kubenswrapper[5008]: I0318 18:58:41.989778 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b79c3e40-d595-4959-ae5b-fd529e6f11c5-utilities\") pod \"redhat-operators-grhhl\" (UID: \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\") " pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:58:41 crc kubenswrapper[5008]: I0318 18:58:41.989879 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b79c3e40-d595-4959-ae5b-fd529e6f11c5-catalog-content\") pod \"redhat-operators-grhhl\" (UID: \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\") " pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:58:41 crc kubenswrapper[5008]: I0318 18:58:41.989952 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm4dd\" (UniqueName: \"kubernetes.io/projected/b79c3e40-d595-4959-ae5b-fd529e6f11c5-kube-api-access-zm4dd\") pod \"redhat-operators-grhhl\" (UID: \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\") " pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:58:42 crc kubenswrapper[5008]: I0318 18:58:42.091533 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b79c3e40-d595-4959-ae5b-fd529e6f11c5-catalog-content\") pod \"redhat-operators-grhhl\" (UID: \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\") " pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:58:42 crc kubenswrapper[5008]: I0318 18:58:42.091648 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm4dd\" (UniqueName: \"kubernetes.io/projected/b79c3e40-d595-4959-ae5b-fd529e6f11c5-kube-api-access-zm4dd\") pod \"redhat-operators-grhhl\" (UID: \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\") " pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:58:42 crc kubenswrapper[5008]: I0318 18:58:42.091696 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b79c3e40-d595-4959-ae5b-fd529e6f11c5-utilities\") pod \"redhat-operators-grhhl\" (UID: \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\") " pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:58:42 crc kubenswrapper[5008]: I0318 18:58:42.092273 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b79c3e40-d595-4959-ae5b-fd529e6f11c5-catalog-content\") pod \"redhat-operators-grhhl\" (UID: \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\") " pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:58:42 crc kubenswrapper[5008]: I0318 18:58:42.092280 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b79c3e40-d595-4959-ae5b-fd529e6f11c5-utilities\") pod \"redhat-operators-grhhl\" (UID: \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\") " pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:58:42 crc kubenswrapper[5008]: I0318 18:58:42.116877 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm4dd\" (UniqueName: \"kubernetes.io/projected/b79c3e40-d595-4959-ae5b-fd529e6f11c5-kube-api-access-zm4dd\") pod \"redhat-operators-grhhl\" (UID: \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\") " pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:58:42 crc kubenswrapper[5008]: I0318 18:58:42.271840 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:58:42 crc kubenswrapper[5008]: I0318 18:58:42.703703 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grhhl"]
Mar 18 18:58:43 crc kubenswrapper[5008]: I0318 18:58:43.447861 5008 generic.go:334] "Generic (PLEG): container finished" podID="b79c3e40-d595-4959-ae5b-fd529e6f11c5" containerID="36f3638a9e7aa9ad429b44466648fc2710f28dc441b21ecfe767d2b7a2754236" exitCode=0
Mar 18 18:58:43 crc kubenswrapper[5008]: I0318 18:58:43.447912 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grhhl" event={"ID":"b79c3e40-d595-4959-ae5b-fd529e6f11c5","Type":"ContainerDied","Data":"36f3638a9e7aa9ad429b44466648fc2710f28dc441b21ecfe767d2b7a2754236"}
Mar 18 18:58:43 crc kubenswrapper[5008]: I0318 18:58:43.448116 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grhhl" event={"ID":"b79c3e40-d595-4959-ae5b-fd529e6f11c5","Type":"ContainerStarted","Data":"40d97469ed3dbc8eac65bb64f1373f6402a165389fefd0534cd1441fd01238b0"}
Mar 18 18:58:44 crc kubenswrapper[5008]: I0318 18:58:44.461532 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grhhl" event={"ID":"b79c3e40-d595-4959-ae5b-fd529e6f11c5","Type":"ContainerStarted","Data":"0c1b96bec4bfc86e089546f905ad126a99186075f58b9f11802c0f316f495f41"}
Mar 18 18:58:45 crc kubenswrapper[5008]: I0318 18:58:45.471811 5008 generic.go:334] "Generic (PLEG): container finished" podID="b79c3e40-d595-4959-ae5b-fd529e6f11c5" containerID="0c1b96bec4bfc86e089546f905ad126a99186075f58b9f11802c0f316f495f41" exitCode=0
Mar 18 18:58:45 crc kubenswrapper[5008]: I0318 18:58:45.471851 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grhhl" event={"ID":"b79c3e40-d595-4959-ae5b-fd529e6f11c5","Type":"ContainerDied","Data":"0c1b96bec4bfc86e089546f905ad126a99186075f58b9f11802c0f316f495f41"}
Mar 18 18:58:46 crc kubenswrapper[5008]: I0318 18:58:46.480493 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grhhl" event={"ID":"b79c3e40-d595-4959-ae5b-fd529e6f11c5","Type":"ContainerStarted","Data":"aaa4e4f66ff67a708263c008ef058e01f4602e37a88cdcbe5ed0b23665cf0e1c"}
Mar 18 18:58:46 crc kubenswrapper[5008]: I0318 18:58:46.511402 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-grhhl" podStartSLOduration=2.98906337 podStartE2EDuration="5.511368771s" podCreationTimestamp="2026-03-18 18:58:41 +0000 UTC" firstStartedPulling="2026-03-18 18:58:43.45115388 +0000 UTC m=+3379.970626959" lastFinishedPulling="2026-03-18 18:58:45.973459241 +0000 UTC m=+3382.492932360" observedRunningTime="2026-03-18 18:58:46.497993832 +0000 UTC m=+3383.017466941" watchObservedRunningTime="2026-03-18 18:58:46.511368771 +0000 UTC m=+3383.030841890"
Mar 18 18:58:52 crc kubenswrapper[5008]: I0318 18:58:52.272750 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:58:52 crc kubenswrapper[5008]: I0318 18:58:52.273355 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:58:53 crc kubenswrapper[5008]: I0318 18:58:53.332938 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-grhhl" podUID="b79c3e40-d595-4959-ae5b-fd529e6f11c5" containerName="registry-server" probeResult="failure" output=<
Mar 18 18:58:53 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s
Mar 18 18:58:53 crc kubenswrapper[5008]: >
Mar 18 18:58:54 crc kubenswrapper[5008]: I0318 18:58:54.460040 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 18:58:54 crc kubenswrapper[5008]: I0318 18:58:54.460135 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 18:59:02 crc kubenswrapper[5008]: I0318 18:59:02.348746 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:59:02 crc kubenswrapper[5008]: I0318 18:59:02.429460 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:59:02 crc kubenswrapper[5008]: I0318 18:59:02.613399 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grhhl"]
Mar 18 18:59:03 crc kubenswrapper[5008]: I0318 18:59:03.647155 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-grhhl" podUID="b79c3e40-d595-4959-ae5b-fd529e6f11c5" containerName="registry-server" containerID="cri-o://aaa4e4f66ff67a708263c008ef058e01f4602e37a88cdcbe5ed0b23665cf0e1c" gracePeriod=2
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.068997 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.228901 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm4dd\" (UniqueName: \"kubernetes.io/projected/b79c3e40-d595-4959-ae5b-fd529e6f11c5-kube-api-access-zm4dd\") pod \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\" (UID: \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\") "
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.228987 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b79c3e40-d595-4959-ae5b-fd529e6f11c5-utilities\") pod \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\" (UID: \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\") "
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.229145 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b79c3e40-d595-4959-ae5b-fd529e6f11c5-catalog-content\") pod \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\" (UID: \"b79c3e40-d595-4959-ae5b-fd529e6f11c5\") "
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.230223 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b79c3e40-d595-4959-ae5b-fd529e6f11c5-utilities" (OuterVolumeSpecName: "utilities") pod "b79c3e40-d595-4959-ae5b-fd529e6f11c5" (UID: "b79c3e40-d595-4959-ae5b-fd529e6f11c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.236278 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b79c3e40-d595-4959-ae5b-fd529e6f11c5-kube-api-access-zm4dd" (OuterVolumeSpecName: "kube-api-access-zm4dd") pod "b79c3e40-d595-4959-ae5b-fd529e6f11c5" (UID: "b79c3e40-d595-4959-ae5b-fd529e6f11c5"). InnerVolumeSpecName "kube-api-access-zm4dd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.332473 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm4dd\" (UniqueName: \"kubernetes.io/projected/b79c3e40-d595-4959-ae5b-fd529e6f11c5-kube-api-access-zm4dd\") on node \"crc\" DevicePath \"\""
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.332523 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b79c3e40-d595-4959-ae5b-fd529e6f11c5-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.405352 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b79c3e40-d595-4959-ae5b-fd529e6f11c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b79c3e40-d595-4959-ae5b-fd529e6f11c5" (UID: "b79c3e40-d595-4959-ae5b-fd529e6f11c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.434012 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b79c3e40-d595-4959-ae5b-fd529e6f11c5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.663835 5008 generic.go:334] "Generic (PLEG): container finished" podID="b79c3e40-d595-4959-ae5b-fd529e6f11c5" containerID="aaa4e4f66ff67a708263c008ef058e01f4602e37a88cdcbe5ed0b23665cf0e1c" exitCode=0
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.663894 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grhhl"
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.663918 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grhhl" event={"ID":"b79c3e40-d595-4959-ae5b-fd529e6f11c5","Type":"ContainerDied","Data":"aaa4e4f66ff67a708263c008ef058e01f4602e37a88cdcbe5ed0b23665cf0e1c"}
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.663986 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grhhl" event={"ID":"b79c3e40-d595-4959-ae5b-fd529e6f11c5","Type":"ContainerDied","Data":"40d97469ed3dbc8eac65bb64f1373f6402a165389fefd0534cd1441fd01238b0"}
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.664038 5008 scope.go:117] "RemoveContainer" containerID="aaa4e4f66ff67a708263c008ef058e01f4602e37a88cdcbe5ed0b23665cf0e1c"
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.696960 5008 scope.go:117] "RemoveContainer" containerID="0c1b96bec4bfc86e089546f905ad126a99186075f58b9f11802c0f316f495f41"
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.711509 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grhhl"]
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.719119 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-grhhl"]
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.784203 5008 scope.go:117] "RemoveContainer" containerID="36f3638a9e7aa9ad429b44466648fc2710f28dc441b21ecfe767d2b7a2754236"
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.800745 5008 scope.go:117] "RemoveContainer" containerID="aaa4e4f66ff67a708263c008ef058e01f4602e37a88cdcbe5ed0b23665cf0e1c"
Mar 18 18:59:04 crc kubenswrapper[5008]: E0318 18:59:04.801657 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaa4e4f66ff67a708263c008ef058e01f4602e37a88cdcbe5ed0b23665cf0e1c\": container with ID starting with aaa4e4f66ff67a708263c008ef058e01f4602e37a88cdcbe5ed0b23665cf0e1c not found: ID does not exist" containerID="aaa4e4f66ff67a708263c008ef058e01f4602e37a88cdcbe5ed0b23665cf0e1c"
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.801698 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaa4e4f66ff67a708263c008ef058e01f4602e37a88cdcbe5ed0b23665cf0e1c"} err="failed to get container status \"aaa4e4f66ff67a708263c008ef058e01f4602e37a88cdcbe5ed0b23665cf0e1c\": rpc error: code = NotFound desc = could not find container \"aaa4e4f66ff67a708263c008ef058e01f4602e37a88cdcbe5ed0b23665cf0e1c\": container with ID starting with aaa4e4f66ff67a708263c008ef058e01f4602e37a88cdcbe5ed0b23665cf0e1c not found: ID does not exist"
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.801718 5008 scope.go:117] "RemoveContainer" containerID="0c1b96bec4bfc86e089546f905ad126a99186075f58b9f11802c0f316f495f41"
Mar 18 18:59:04 crc kubenswrapper[5008]: E0318 18:59:04.802073 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c1b96bec4bfc86e089546f905ad126a99186075f58b9f11802c0f316f495f41\": container with ID starting with 0c1b96bec4bfc86e089546f905ad126a99186075f58b9f11802c0f316f495f41 not found: ID does not exist" containerID="0c1b96bec4bfc86e089546f905ad126a99186075f58b9f11802c0f316f495f41"
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.802130 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c1b96bec4bfc86e089546f905ad126a99186075f58b9f11802c0f316f495f41"} err="failed to get container status \"0c1b96bec4bfc86e089546f905ad126a99186075f58b9f11802c0f316f495f41\": rpc error: code = NotFound desc = could not find container \"0c1b96bec4bfc86e089546f905ad126a99186075f58b9f11802c0f316f495f41\": container with ID starting with 0c1b96bec4bfc86e089546f905ad126a99186075f58b9f11802c0f316f495f41 not found: ID does not exist"
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.802171 5008 scope.go:117] "RemoveContainer" containerID="36f3638a9e7aa9ad429b44466648fc2710f28dc441b21ecfe767d2b7a2754236"
Mar 18 18:59:04 crc kubenswrapper[5008]: E0318 18:59:04.802472 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36f3638a9e7aa9ad429b44466648fc2710f28dc441b21ecfe767d2b7a2754236\": container with ID starting with 36f3638a9e7aa9ad429b44466648fc2710f28dc441b21ecfe767d2b7a2754236 not found: ID does not exist" containerID="36f3638a9e7aa9ad429b44466648fc2710f28dc441b21ecfe767d2b7a2754236"
Mar 18 18:59:04 crc kubenswrapper[5008]: I0318 18:59:04.802507 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36f3638a9e7aa9ad429b44466648fc2710f28dc441b21ecfe767d2b7a2754236"} err="failed to get container status \"36f3638a9e7aa9ad429b44466648fc2710f28dc441b21ecfe767d2b7a2754236\": rpc error: code = NotFound desc = could not find container \"36f3638a9e7aa9ad429b44466648fc2710f28dc441b21ecfe767d2b7a2754236\": container with ID starting with 36f3638a9e7aa9ad429b44466648fc2710f28dc441b21ecfe767d2b7a2754236 not found: ID does not exist"
Mar 18 18:59:06 crc kubenswrapper[5008]: I0318 18:59:06.210336 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b79c3e40-d595-4959-ae5b-fd529e6f11c5" path="/var/lib/kubelet/pods/b79c3e40-d595-4959-ae5b-fd529e6f11c5/volumes"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.098633 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sq65b"]
Mar 18 18:59:20 crc kubenswrapper[5008]: E0318 18:59:20.100152 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b79c3e40-d595-4959-ae5b-fd529e6f11c5" containerName="extract-content"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.100188 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79c3e40-d595-4959-ae5b-fd529e6f11c5" containerName="extract-content"
Mar 18 18:59:20 crc kubenswrapper[5008]: E0318 18:59:20.100220 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b79c3e40-d595-4959-ae5b-fd529e6f11c5" containerName="extract-utilities"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.100242 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79c3e40-d595-4959-ae5b-fd529e6f11c5" containerName="extract-utilities"
Mar 18 18:59:20 crc kubenswrapper[5008]: E0318 18:59:20.100293 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b79c3e40-d595-4959-ae5b-fd529e6f11c5" containerName="registry-server"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.100314 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79c3e40-d595-4959-ae5b-fd529e6f11c5" containerName="registry-server"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.100809 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b79c3e40-d595-4959-ae5b-fd529e6f11c5" containerName="registry-server"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.105608 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sq65b"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.110980 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq65b"]
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.120756 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-utilities\") pod \"redhat-marketplace-sq65b\" (UID: \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\") " pod="openshift-marketplace/redhat-marketplace-sq65b"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.121259 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42np8\" (UniqueName: \"kubernetes.io/projected/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-kube-api-access-42np8\") pod \"redhat-marketplace-sq65b\" (UID: \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\") " pod="openshift-marketplace/redhat-marketplace-sq65b"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.121640 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-catalog-content\") pod \"redhat-marketplace-sq65b\" (UID: \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\") " pod="openshift-marketplace/redhat-marketplace-sq65b"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.222934 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-catalog-content\") pod \"redhat-marketplace-sq65b\" (UID: \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\") " pod="openshift-marketplace/redhat-marketplace-sq65b"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.223019 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-utilities\") pod \"redhat-marketplace-sq65b\" (UID: \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\") " pod="openshift-marketplace/redhat-marketplace-sq65b"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.223046 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42np8\" (UniqueName: \"kubernetes.io/projected/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-kube-api-access-42np8\") pod \"redhat-marketplace-sq65b\" (UID: \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\") " pod="openshift-marketplace/redhat-marketplace-sq65b"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.223613 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-catalog-content\") pod \"redhat-marketplace-sq65b\" (UID: \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\") " pod="openshift-marketplace/redhat-marketplace-sq65b"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.223974 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-utilities\") pod \"redhat-marketplace-sq65b\" (UID: \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\") " pod="openshift-marketplace/redhat-marketplace-sq65b"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.241803 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42np8\" (UniqueName: \"kubernetes.io/projected/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-kube-api-access-42np8\") pod \"redhat-marketplace-sq65b\" (UID: \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\") " pod="openshift-marketplace/redhat-marketplace-sq65b"
Mar 18 18:59:20 crc kubenswrapper[5008]: I0318 18:59:20.441688 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sq65b"
Mar 18 18:59:21 crc kubenswrapper[5008]: I0318 18:59:20.912959 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq65b"]
Mar 18 18:59:21 crc kubenswrapper[5008]: W0318 18:59:20.918886 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1e1b334_ab8b_4ad2_8244_a51b520cf34d.slice/crio-bcb07bb470a5caefa17cce2cb99990ac18cd660a89fae96d75b6946774ed6b5f WatchSource:0}: Error finding container bcb07bb470a5caefa17cce2cb99990ac18cd660a89fae96d75b6946774ed6b5f: Status 404 returned error can't find the container with id bcb07bb470a5caefa17cce2cb99990ac18cd660a89fae96d75b6946774ed6b5f
Mar 18 18:59:21 crc kubenswrapper[5008]: I0318 18:59:21.817118 5008 generic.go:334] "Generic (PLEG): container finished" podID="d1e1b334-ab8b-4ad2-8244-a51b520cf34d" containerID="5f29ae7afcfebef6b3301147431fbd68a3e863df620574209725ee8d2c46d7cf" exitCode=0
Mar 18 18:59:21 crc kubenswrapper[5008]: I0318 18:59:21.817197 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq65b" event={"ID":"d1e1b334-ab8b-4ad2-8244-a51b520cf34d","Type":"ContainerDied","Data":"5f29ae7afcfebef6b3301147431fbd68a3e863df620574209725ee8d2c46d7cf"}
Mar 18 18:59:21 crc kubenswrapper[5008]: I0318 18:59:21.817489 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq65b" event={"ID":"d1e1b334-ab8b-4ad2-8244-a51b520cf34d","Type":"ContainerStarted","Data":"bcb07bb470a5caefa17cce2cb99990ac18cd660a89fae96d75b6946774ed6b5f"}
Mar 18 18:59:22 crc kubenswrapper[5008]: I0318 18:59:22.829619 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq65b"
event={"ID":"d1e1b334-ab8b-4ad2-8244-a51b520cf34d","Type":"ContainerStarted","Data":"cccd15d757b5772bb4426c78f216f102ff75311eb0a4bdff4d7cc723b63bc92d"} Mar 18 18:59:23 crc kubenswrapper[5008]: I0318 18:59:23.841785 5008 generic.go:334] "Generic (PLEG): container finished" podID="d1e1b334-ab8b-4ad2-8244-a51b520cf34d" containerID="cccd15d757b5772bb4426c78f216f102ff75311eb0a4bdff4d7cc723b63bc92d" exitCode=0 Mar 18 18:59:23 crc kubenswrapper[5008]: I0318 18:59:23.841842 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq65b" event={"ID":"d1e1b334-ab8b-4ad2-8244-a51b520cf34d","Type":"ContainerDied","Data":"cccd15d757b5772bb4426c78f216f102ff75311eb0a4bdff4d7cc723b63bc92d"} Mar 18 18:59:24 crc kubenswrapper[5008]: I0318 18:59:24.460641 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:59:24 crc kubenswrapper[5008]: I0318 18:59:24.460964 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:59:24 crc kubenswrapper[5008]: I0318 18:59:24.852764 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq65b" event={"ID":"d1e1b334-ab8b-4ad2-8244-a51b520cf34d","Type":"ContainerStarted","Data":"1459dae44e8b2e1e59361765751f7b4dcca7dfe502b37f0181bce25f23738534"} Mar 18 18:59:24 crc kubenswrapper[5008]: I0318 18:59:24.870247 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sq65b" 
podStartSLOduration=2.446065342 podStartE2EDuration="4.870219929s" podCreationTimestamp="2026-03-18 18:59:20 +0000 UTC" firstStartedPulling="2026-03-18 18:59:21.819525117 +0000 UTC m=+3418.338998236" lastFinishedPulling="2026-03-18 18:59:24.243679744 +0000 UTC m=+3420.763152823" observedRunningTime="2026-03-18 18:59:24.867889578 +0000 UTC m=+3421.387362697" watchObservedRunningTime="2026-03-18 18:59:24.870219929 +0000 UTC m=+3421.389693038" Mar 18 18:59:30 crc kubenswrapper[5008]: I0318 18:59:30.442167 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sq65b" Mar 18 18:59:30 crc kubenswrapper[5008]: I0318 18:59:30.442671 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sq65b" Mar 18 18:59:30 crc kubenswrapper[5008]: I0318 18:59:30.519382 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sq65b" Mar 18 18:59:30 crc kubenswrapper[5008]: I0318 18:59:30.954478 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sq65b" Mar 18 18:59:31 crc kubenswrapper[5008]: I0318 18:59:31.009731 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq65b"] Mar 18 18:59:32 crc kubenswrapper[5008]: I0318 18:59:32.933384 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sq65b" podUID="d1e1b334-ab8b-4ad2-8244-a51b520cf34d" containerName="registry-server" containerID="cri-o://1459dae44e8b2e1e59361765751f7b4dcca7dfe502b37f0181bce25f23738534" gracePeriod=2 Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.438541 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sq65b" Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.621704 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42np8\" (UniqueName: \"kubernetes.io/projected/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-kube-api-access-42np8\") pod \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\" (UID: \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\") " Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.621795 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-catalog-content\") pod \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\" (UID: \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\") " Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.621824 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-utilities\") pod \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\" (UID: \"d1e1b334-ab8b-4ad2-8244-a51b520cf34d\") " Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.623150 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-utilities" (OuterVolumeSpecName: "utilities") pod "d1e1b334-ab8b-4ad2-8244-a51b520cf34d" (UID: "d1e1b334-ab8b-4ad2-8244-a51b520cf34d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.631579 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-kube-api-access-42np8" (OuterVolumeSpecName: "kube-api-access-42np8") pod "d1e1b334-ab8b-4ad2-8244-a51b520cf34d" (UID: "d1e1b334-ab8b-4ad2-8244-a51b520cf34d"). InnerVolumeSpecName "kube-api-access-42np8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.673284 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1e1b334-ab8b-4ad2-8244-a51b520cf34d" (UID: "d1e1b334-ab8b-4ad2-8244-a51b520cf34d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.723523 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42np8\" (UniqueName: \"kubernetes.io/projected/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-kube-api-access-42np8\") on node \"crc\" DevicePath \"\"" Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.723574 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.723589 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e1b334-ab8b-4ad2-8244-a51b520cf34d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.947462 5008 generic.go:334] "Generic (PLEG): container finished" podID="d1e1b334-ab8b-4ad2-8244-a51b520cf34d" containerID="1459dae44e8b2e1e59361765751f7b4dcca7dfe502b37f0181bce25f23738534" exitCode=0 Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.947540 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sq65b" Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.947540 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq65b" event={"ID":"d1e1b334-ab8b-4ad2-8244-a51b520cf34d","Type":"ContainerDied","Data":"1459dae44e8b2e1e59361765751f7b4dcca7dfe502b37f0181bce25f23738534"} Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.948770 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq65b" event={"ID":"d1e1b334-ab8b-4ad2-8244-a51b520cf34d","Type":"ContainerDied","Data":"bcb07bb470a5caefa17cce2cb99990ac18cd660a89fae96d75b6946774ed6b5f"} Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.948911 5008 scope.go:117] "RemoveContainer" containerID="1459dae44e8b2e1e59361765751f7b4dcca7dfe502b37f0181bce25f23738534" Mar 18 18:59:33 crc kubenswrapper[5008]: I0318 18:59:33.985633 5008 scope.go:117] "RemoveContainer" containerID="cccd15d757b5772bb4426c78f216f102ff75311eb0a4bdff4d7cc723b63bc92d" Mar 18 18:59:34 crc kubenswrapper[5008]: I0318 18:59:34.010718 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq65b"] Mar 18 18:59:34 crc kubenswrapper[5008]: I0318 18:59:34.019574 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq65b"] Mar 18 18:59:34 crc kubenswrapper[5008]: I0318 18:59:34.023395 5008 scope.go:117] "RemoveContainer" containerID="5f29ae7afcfebef6b3301147431fbd68a3e863df620574209725ee8d2c46d7cf" Mar 18 18:59:34 crc kubenswrapper[5008]: I0318 18:59:34.059963 5008 scope.go:117] "RemoveContainer" containerID="1459dae44e8b2e1e59361765751f7b4dcca7dfe502b37f0181bce25f23738534" Mar 18 18:59:34 crc kubenswrapper[5008]: E0318 18:59:34.060624 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1459dae44e8b2e1e59361765751f7b4dcca7dfe502b37f0181bce25f23738534\": container with ID starting with 1459dae44e8b2e1e59361765751f7b4dcca7dfe502b37f0181bce25f23738534 not found: ID does not exist" containerID="1459dae44e8b2e1e59361765751f7b4dcca7dfe502b37f0181bce25f23738534" Mar 18 18:59:34 crc kubenswrapper[5008]: I0318 18:59:34.060673 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1459dae44e8b2e1e59361765751f7b4dcca7dfe502b37f0181bce25f23738534"} err="failed to get container status \"1459dae44e8b2e1e59361765751f7b4dcca7dfe502b37f0181bce25f23738534\": rpc error: code = NotFound desc = could not find container \"1459dae44e8b2e1e59361765751f7b4dcca7dfe502b37f0181bce25f23738534\": container with ID starting with 1459dae44e8b2e1e59361765751f7b4dcca7dfe502b37f0181bce25f23738534 not found: ID does not exist" Mar 18 18:59:34 crc kubenswrapper[5008]: I0318 18:59:34.060706 5008 scope.go:117] "RemoveContainer" containerID="cccd15d757b5772bb4426c78f216f102ff75311eb0a4bdff4d7cc723b63bc92d" Mar 18 18:59:34 crc kubenswrapper[5008]: E0318 18:59:34.061396 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cccd15d757b5772bb4426c78f216f102ff75311eb0a4bdff4d7cc723b63bc92d\": container with ID starting with cccd15d757b5772bb4426c78f216f102ff75311eb0a4bdff4d7cc723b63bc92d not found: ID does not exist" containerID="cccd15d757b5772bb4426c78f216f102ff75311eb0a4bdff4d7cc723b63bc92d" Mar 18 18:59:34 crc kubenswrapper[5008]: I0318 18:59:34.061639 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cccd15d757b5772bb4426c78f216f102ff75311eb0a4bdff4d7cc723b63bc92d"} err="failed to get container status \"cccd15d757b5772bb4426c78f216f102ff75311eb0a4bdff4d7cc723b63bc92d\": rpc error: code = NotFound desc = could not find container \"cccd15d757b5772bb4426c78f216f102ff75311eb0a4bdff4d7cc723b63bc92d\": container with ID 
starting with cccd15d757b5772bb4426c78f216f102ff75311eb0a4bdff4d7cc723b63bc92d not found: ID does not exist" Mar 18 18:59:34 crc kubenswrapper[5008]: I0318 18:59:34.061804 5008 scope.go:117] "RemoveContainer" containerID="5f29ae7afcfebef6b3301147431fbd68a3e863df620574209725ee8d2c46d7cf" Mar 18 18:59:34 crc kubenswrapper[5008]: E0318 18:59:34.062451 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f29ae7afcfebef6b3301147431fbd68a3e863df620574209725ee8d2c46d7cf\": container with ID starting with 5f29ae7afcfebef6b3301147431fbd68a3e863df620574209725ee8d2c46d7cf not found: ID does not exist" containerID="5f29ae7afcfebef6b3301147431fbd68a3e863df620574209725ee8d2c46d7cf" Mar 18 18:59:34 crc kubenswrapper[5008]: I0318 18:59:34.062719 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f29ae7afcfebef6b3301147431fbd68a3e863df620574209725ee8d2c46d7cf"} err="failed to get container status \"5f29ae7afcfebef6b3301147431fbd68a3e863df620574209725ee8d2c46d7cf\": rpc error: code = NotFound desc = could not find container \"5f29ae7afcfebef6b3301147431fbd68a3e863df620574209725ee8d2c46d7cf\": container with ID starting with 5f29ae7afcfebef6b3301147431fbd68a3e863df620574209725ee8d2c46d7cf not found: ID does not exist" Mar 18 18:59:34 crc kubenswrapper[5008]: I0318 18:59:34.210375 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e1b334-ab8b-4ad2-8244-a51b520cf34d" path="/var/lib/kubelet/pods/d1e1b334-ab8b-4ad2-8244-a51b520cf34d/volumes" Mar 18 18:59:54 crc kubenswrapper[5008]: I0318 18:59:54.459926 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 18:59:54 crc kubenswrapper[5008]: I0318 
18:59:54.460389 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 18:59:54 crc kubenswrapper[5008]: I0318 18:59:54.460434 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 18:59:54 crc kubenswrapper[5008]: I0318 18:59:54.460992 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1bf88bf8ac33b078694e8969100bad04b434510b86f4de3b740392ceba02f57d"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 18:59:54 crc kubenswrapper[5008]: I0318 18:59:54.461042 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://1bf88bf8ac33b078694e8969100bad04b434510b86f4de3b740392ceba02f57d" gracePeriod=600 Mar 18 18:59:55 crc kubenswrapper[5008]: I0318 18:59:55.157652 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="1bf88bf8ac33b078694e8969100bad04b434510b86f4de3b740392ceba02f57d" exitCode=0 Mar 18 18:59:55 crc kubenswrapper[5008]: I0318 18:59:55.157738 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"1bf88bf8ac33b078694e8969100bad04b434510b86f4de3b740392ceba02f57d"} Mar 18 18:59:55 crc 
kubenswrapper[5008]: I0318 18:59:55.158025 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c"} Mar 18 18:59:55 crc kubenswrapper[5008]: I0318 18:59:55.158077 5008 scope.go:117] "RemoveContainer" containerID="cd9821c813861b8c9387e65fba3e51148fa4d5115bba73782b38008ecae419ab" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.153272 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564340-lsngw"] Mar 18 19:00:00 crc kubenswrapper[5008]: E0318 19:00:00.154240 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e1b334-ab8b-4ad2-8244-a51b520cf34d" containerName="registry-server" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.154259 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e1b334-ab8b-4ad2-8244-a51b520cf34d" containerName="registry-server" Mar 18 19:00:00 crc kubenswrapper[5008]: E0318 19:00:00.154289 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e1b334-ab8b-4ad2-8244-a51b520cf34d" containerName="extract-content" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.154298 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e1b334-ab8b-4ad2-8244-a51b520cf34d" containerName="extract-content" Mar 18 19:00:00 crc kubenswrapper[5008]: E0318 19:00:00.154321 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e1b334-ab8b-4ad2-8244-a51b520cf34d" containerName="extract-utilities" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.154329 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e1b334-ab8b-4ad2-8244-a51b520cf34d" containerName="extract-utilities" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.154507 5008 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d1e1b334-ab8b-4ad2-8244-a51b520cf34d" containerName="registry-server" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.155140 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564340-lsngw" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.157021 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.158517 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.158875 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.160878 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg"] Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.162037 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.165532 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.166419 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.172064 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564340-lsngw"] Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.192012 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg"] Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.275573 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-secret-volume\") pod \"collect-profiles-29564340-h6qhg\" (UID: \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.275839 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzdz6\" (UniqueName: \"kubernetes.io/projected/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-kube-api-access-jzdz6\") pod \"collect-profiles-29564340-h6qhg\" (UID: \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.275901 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-config-volume\") pod \"collect-profiles-29564340-h6qhg\" (UID: \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.276003 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-968kb\" (UniqueName: \"kubernetes.io/projected/fc851209-de68-41aa-9342-980b3c267cda-kube-api-access-968kb\") pod \"auto-csr-approver-29564340-lsngw\" (UID: \"fc851209-de68-41aa-9342-980b3c267cda\") " pod="openshift-infra/auto-csr-approver-29564340-lsngw" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.377008 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-secret-volume\") pod \"collect-profiles-29564340-h6qhg\" (UID: \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.377382 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzdz6\" (UniqueName: \"kubernetes.io/projected/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-kube-api-access-jzdz6\") pod \"collect-profiles-29564340-h6qhg\" (UID: \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.377580 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-config-volume\") pod \"collect-profiles-29564340-h6qhg\" (UID: \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" Mar 18 19:00:00 crc kubenswrapper[5008]: 
I0318 19:00:00.377874 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-968kb\" (UniqueName: \"kubernetes.io/projected/fc851209-de68-41aa-9342-980b3c267cda-kube-api-access-968kb\") pod \"auto-csr-approver-29564340-lsngw\" (UID: \"fc851209-de68-41aa-9342-980b3c267cda\") " pod="openshift-infra/auto-csr-approver-29564340-lsngw" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.378406 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-config-volume\") pod \"collect-profiles-29564340-h6qhg\" (UID: \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.385146 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-secret-volume\") pod \"collect-profiles-29564340-h6qhg\" (UID: \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.400671 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzdz6\" (UniqueName: \"kubernetes.io/projected/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-kube-api-access-jzdz6\") pod \"collect-profiles-29564340-h6qhg\" (UID: \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.410639 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-968kb\" (UniqueName: \"kubernetes.io/projected/fc851209-de68-41aa-9342-980b3c267cda-kube-api-access-968kb\") pod \"auto-csr-approver-29564340-lsngw\" (UID: \"fc851209-de68-41aa-9342-980b3c267cda\") " 
pod="openshift-infra/auto-csr-approver-29564340-lsngw" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.476238 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564340-lsngw" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.485980 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" Mar 18 19:00:00 crc kubenswrapper[5008]: I0318 19:00:00.955872 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564340-lsngw"] Mar 18 19:00:01 crc kubenswrapper[5008]: I0318 19:00:01.011163 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg"] Mar 18 19:00:01 crc kubenswrapper[5008]: W0318 19:00:01.013717 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2cfdd95_391e_4d2a_8d9b_13c5338d10d0.slice/crio-34083e6c1697b428c891cc1580084539b657ad416286ff0634e267e090674c08 WatchSource:0}: Error finding container 34083e6c1697b428c891cc1580084539b657ad416286ff0634e267e090674c08: Status 404 returned error can't find the container with id 34083e6c1697b428c891cc1580084539b657ad416286ff0634e267e090674c08 Mar 18 19:00:01 crc kubenswrapper[5008]: I0318 19:00:01.212065 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564340-lsngw" event={"ID":"fc851209-de68-41aa-9342-980b3c267cda","Type":"ContainerStarted","Data":"f3771cfbc0f249c460207ce06c9f544ca8e12c27ddd4779a396bf8fba4967947"} Mar 18 19:00:01 crc kubenswrapper[5008]: I0318 19:00:01.213683 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" 
event={"ID":"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0","Type":"ContainerStarted","Data":"dfd3f7ada93ea3d5db7952c051569b3cc97a570bd7acc7ebf050800616747abe"} Mar 18 19:00:01 crc kubenswrapper[5008]: I0318 19:00:01.213794 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" event={"ID":"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0","Type":"ContainerStarted","Data":"34083e6c1697b428c891cc1580084539b657ad416286ff0634e267e090674c08"} Mar 18 19:00:01 crc kubenswrapper[5008]: I0318 19:00:01.235041 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" podStartSLOduration=1.235023464 podStartE2EDuration="1.235023464s" podCreationTimestamp="2026-03-18 19:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:00:01.230588058 +0000 UTC m=+3457.750061127" watchObservedRunningTime="2026-03-18 19:00:01.235023464 +0000 UTC m=+3457.754496533" Mar 18 19:00:02 crc kubenswrapper[5008]: I0318 19:00:02.220771 5008 generic.go:334] "Generic (PLEG): container finished" podID="b2cfdd95-391e-4d2a-8d9b-13c5338d10d0" containerID="dfd3f7ada93ea3d5db7952c051569b3cc97a570bd7acc7ebf050800616747abe" exitCode=0 Mar 18 19:00:02 crc kubenswrapper[5008]: I0318 19:00:02.220862 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" event={"ID":"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0","Type":"ContainerDied","Data":"dfd3f7ada93ea3d5db7952c051569b3cc97a570bd7acc7ebf050800616747abe"} Mar 18 19:00:03 crc kubenswrapper[5008]: I0318 19:00:03.520295 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" Mar 18 19:00:03 crc kubenswrapper[5008]: I0318 19:00:03.624642 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-secret-volume\") pod \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\" (UID: \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\") " Mar 18 19:00:03 crc kubenswrapper[5008]: I0318 19:00:03.624737 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzdz6\" (UniqueName: \"kubernetes.io/projected/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-kube-api-access-jzdz6\") pod \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\" (UID: \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\") " Mar 18 19:00:03 crc kubenswrapper[5008]: I0318 19:00:03.624996 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-config-volume\") pod \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\" (UID: \"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0\") " Mar 18 19:00:03 crc kubenswrapper[5008]: I0318 19:00:03.625760 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-config-volume" (OuterVolumeSpecName: "config-volume") pod "b2cfdd95-391e-4d2a-8d9b-13c5338d10d0" (UID: "b2cfdd95-391e-4d2a-8d9b-13c5338d10d0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:00:03 crc kubenswrapper[5008]: I0318 19:00:03.626390 5008 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 19:00:03 crc kubenswrapper[5008]: I0318 19:00:03.640076 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b2cfdd95-391e-4d2a-8d9b-13c5338d10d0" (UID: "b2cfdd95-391e-4d2a-8d9b-13c5338d10d0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 19:00:03 crc kubenswrapper[5008]: I0318 19:00:03.640575 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-kube-api-access-jzdz6" (OuterVolumeSpecName: "kube-api-access-jzdz6") pod "b2cfdd95-391e-4d2a-8d9b-13c5338d10d0" (UID: "b2cfdd95-391e-4d2a-8d9b-13c5338d10d0"). InnerVolumeSpecName "kube-api-access-jzdz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:00:03 crc kubenswrapper[5008]: I0318 19:00:03.727724 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzdz6\" (UniqueName: \"kubernetes.io/projected/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-kube-api-access-jzdz6\") on node \"crc\" DevicePath \"\"" Mar 18 19:00:03 crc kubenswrapper[5008]: I0318 19:00:03.727761 5008 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2cfdd95-391e-4d2a-8d9b-13c5338d10d0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 19:00:04 crc kubenswrapper[5008]: I0318 19:00:04.235069 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" event={"ID":"b2cfdd95-391e-4d2a-8d9b-13c5338d10d0","Type":"ContainerDied","Data":"34083e6c1697b428c891cc1580084539b657ad416286ff0634e267e090674c08"} Mar 18 19:00:04 crc kubenswrapper[5008]: I0318 19:00:04.235430 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34083e6c1697b428c891cc1580084539b657ad416286ff0634e267e090674c08" Mar 18 19:00:04 crc kubenswrapper[5008]: I0318 19:00:04.235145 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564340-h6qhg" Mar 18 19:00:04 crc kubenswrapper[5008]: I0318 19:00:04.295606 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx"] Mar 18 19:00:04 crc kubenswrapper[5008]: I0318 19:00:04.301452 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564295-k4stx"] Mar 18 19:00:06 crc kubenswrapper[5008]: I0318 19:00:06.209009 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cbb1f62-5be6-446b-b529-8d5137ab403d" path="/var/lib/kubelet/pods/6cbb1f62-5be6-446b-b529-8d5137ab403d/volumes" Mar 18 19:00:11 crc kubenswrapper[5008]: I0318 19:00:11.319047 5008 generic.go:334] "Generic (PLEG): container finished" podID="fc851209-de68-41aa-9342-980b3c267cda" containerID="bd7088d90e33dd17336bef159f60658e6cacbe2692d551dc863d25fc1a99d120" exitCode=0 Mar 18 19:00:11 crc kubenswrapper[5008]: I0318 19:00:11.319590 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564340-lsngw" event={"ID":"fc851209-de68-41aa-9342-980b3c267cda","Type":"ContainerDied","Data":"bd7088d90e33dd17336bef159f60658e6cacbe2692d551dc863d25fc1a99d120"} Mar 18 19:00:12 crc kubenswrapper[5008]: I0318 19:00:12.684638 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564340-lsngw" Mar 18 19:00:12 crc kubenswrapper[5008]: I0318 19:00:12.878177 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-968kb\" (UniqueName: \"kubernetes.io/projected/fc851209-de68-41aa-9342-980b3c267cda-kube-api-access-968kb\") pod \"fc851209-de68-41aa-9342-980b3c267cda\" (UID: \"fc851209-de68-41aa-9342-980b3c267cda\") " Mar 18 19:00:12 crc kubenswrapper[5008]: I0318 19:00:12.883864 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc851209-de68-41aa-9342-980b3c267cda-kube-api-access-968kb" (OuterVolumeSpecName: "kube-api-access-968kb") pod "fc851209-de68-41aa-9342-980b3c267cda" (UID: "fc851209-de68-41aa-9342-980b3c267cda"). InnerVolumeSpecName "kube-api-access-968kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:00:12 crc kubenswrapper[5008]: I0318 19:00:12.980455 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-968kb\" (UniqueName: \"kubernetes.io/projected/fc851209-de68-41aa-9342-980b3c267cda-kube-api-access-968kb\") on node \"crc\" DevicePath \"\"" Mar 18 19:00:13 crc kubenswrapper[5008]: I0318 19:00:13.337458 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564340-lsngw" event={"ID":"fc851209-de68-41aa-9342-980b3c267cda","Type":"ContainerDied","Data":"f3771cfbc0f249c460207ce06c9f544ca8e12c27ddd4779a396bf8fba4967947"} Mar 18 19:00:13 crc kubenswrapper[5008]: I0318 19:00:13.337501 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3771cfbc0f249c460207ce06c9f544ca8e12c27ddd4779a396bf8fba4967947" Mar 18 19:00:13 crc kubenswrapper[5008]: I0318 19:00:13.337526 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564340-lsngw" Mar 18 19:00:13 crc kubenswrapper[5008]: I0318 19:00:13.760492 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564334-tqlpj"] Mar 18 19:00:13 crc kubenswrapper[5008]: I0318 19:00:13.766496 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564334-tqlpj"] Mar 18 19:00:14 crc kubenswrapper[5008]: I0318 19:00:14.215485 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b2493a1-6f79-453b-896e-2b36a181a652" path="/var/lib/kubelet/pods/7b2493a1-6f79-453b-896e-2b36a181a652/volumes" Mar 18 19:00:42 crc kubenswrapper[5008]: I0318 19:00:42.075161 5008 scope.go:117] "RemoveContainer" containerID="2a2feddf140861300acf9920789a4be48f7fd0da7e743a1816618bf2746b0ead" Mar 18 19:00:42 crc kubenswrapper[5008]: I0318 19:00:42.110265 5008 scope.go:117] "RemoveContainer" containerID="6fcfabb6c561f32b59bc8ca9e187df3242a4999cea765dd32fe8a6484d93cb95" Mar 18 19:01:54 crc kubenswrapper[5008]: I0318 19:01:54.460228 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:01:54 crc kubenswrapper[5008]: I0318 19:01:54.460801 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.152856 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564342-kb52p"] Mar 18 19:02:00 crc kubenswrapper[5008]: 
E0318 19:02:00.153866 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc851209-de68-41aa-9342-980b3c267cda" containerName="oc" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.153887 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc851209-de68-41aa-9342-980b3c267cda" containerName="oc" Mar 18 19:02:00 crc kubenswrapper[5008]: E0318 19:02:00.153907 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2cfdd95-391e-4d2a-8d9b-13c5338d10d0" containerName="collect-profiles" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.153918 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2cfdd95-391e-4d2a-8d9b-13c5338d10d0" containerName="collect-profiles" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.154121 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2cfdd95-391e-4d2a-8d9b-13c5338d10d0" containerName="collect-profiles" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.154151 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc851209-de68-41aa-9342-980b3c267cda" containerName="oc" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.155015 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564342-kb52p" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.162089 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.162187 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.162759 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.209360 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564342-kb52p"] Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.275434 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6n68\" (UniqueName: \"kubernetes.io/projected/dace82ea-138d-410a-a0c0-61e591994595-kube-api-access-d6n68\") pod \"auto-csr-approver-29564342-kb52p\" (UID: \"dace82ea-138d-410a-a0c0-61e591994595\") " pod="openshift-infra/auto-csr-approver-29564342-kb52p" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.377046 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6n68\" (UniqueName: \"kubernetes.io/projected/dace82ea-138d-410a-a0c0-61e591994595-kube-api-access-d6n68\") pod \"auto-csr-approver-29564342-kb52p\" (UID: \"dace82ea-138d-410a-a0c0-61e591994595\") " pod="openshift-infra/auto-csr-approver-29564342-kb52p" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.408713 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6n68\" (UniqueName: \"kubernetes.io/projected/dace82ea-138d-410a-a0c0-61e591994595-kube-api-access-d6n68\") pod \"auto-csr-approver-29564342-kb52p\" (UID: \"dace82ea-138d-410a-a0c0-61e591994595\") " 
pod="openshift-infra/auto-csr-approver-29564342-kb52p" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.504381 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564342-kb52p" Mar 18 19:02:00 crc kubenswrapper[5008]: I0318 19:02:00.962408 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564342-kb52p"] Mar 18 19:02:01 crc kubenswrapper[5008]: I0318 19:02:01.269935 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564342-kb52p" event={"ID":"dace82ea-138d-410a-a0c0-61e591994595","Type":"ContainerStarted","Data":"86c0fd9cf1816c7865c52b3fcee0a5459cddbb4438850d24750f6a47ed75af5f"} Mar 18 19:02:03 crc kubenswrapper[5008]: I0318 19:02:03.289747 5008 generic.go:334] "Generic (PLEG): container finished" podID="dace82ea-138d-410a-a0c0-61e591994595" containerID="2d7e628a9eeffcd12a0a81c3f2e975b63f449659509166139041aac6675d02e3" exitCode=0 Mar 18 19:02:03 crc kubenswrapper[5008]: I0318 19:02:03.289833 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564342-kb52p" event={"ID":"dace82ea-138d-410a-a0c0-61e591994595","Type":"ContainerDied","Data":"2d7e628a9eeffcd12a0a81c3f2e975b63f449659509166139041aac6675d02e3"} Mar 18 19:02:04 crc kubenswrapper[5008]: I0318 19:02:04.659329 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564342-kb52p" Mar 18 19:02:04 crc kubenswrapper[5008]: I0318 19:02:04.744225 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6n68\" (UniqueName: \"kubernetes.io/projected/dace82ea-138d-410a-a0c0-61e591994595-kube-api-access-d6n68\") pod \"dace82ea-138d-410a-a0c0-61e591994595\" (UID: \"dace82ea-138d-410a-a0c0-61e591994595\") " Mar 18 19:02:04 crc kubenswrapper[5008]: I0318 19:02:04.755768 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dace82ea-138d-410a-a0c0-61e591994595-kube-api-access-d6n68" (OuterVolumeSpecName: "kube-api-access-d6n68") pod "dace82ea-138d-410a-a0c0-61e591994595" (UID: "dace82ea-138d-410a-a0c0-61e591994595"). InnerVolumeSpecName "kube-api-access-d6n68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:02:04 crc kubenswrapper[5008]: I0318 19:02:04.846202 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6n68\" (UniqueName: \"kubernetes.io/projected/dace82ea-138d-410a-a0c0-61e591994595-kube-api-access-d6n68\") on node \"crc\" DevicePath \"\"" Mar 18 19:02:05 crc kubenswrapper[5008]: I0318 19:02:05.308484 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564342-kb52p" event={"ID":"dace82ea-138d-410a-a0c0-61e591994595","Type":"ContainerDied","Data":"86c0fd9cf1816c7865c52b3fcee0a5459cddbb4438850d24750f6a47ed75af5f"} Mar 18 19:02:05 crc kubenswrapper[5008]: I0318 19:02:05.308537 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564342-kb52p" Mar 18 19:02:05 crc kubenswrapper[5008]: I0318 19:02:05.308539 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86c0fd9cf1816c7865c52b3fcee0a5459cddbb4438850d24750f6a47ed75af5f" Mar 18 19:02:05 crc kubenswrapper[5008]: I0318 19:02:05.739212 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564336-qchwh"] Mar 18 19:02:05 crc kubenswrapper[5008]: I0318 19:02:05.750165 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564336-qchwh"] Mar 18 19:02:06 crc kubenswrapper[5008]: I0318 19:02:06.214276 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01c512e1-98d9-45c4-a62a-397ef227b76f" path="/var/lib/kubelet/pods/01c512e1-98d9-45c4-a62a-397ef227b76f/volumes" Mar 18 19:02:24 crc kubenswrapper[5008]: I0318 19:02:24.460618 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:02:24 crc kubenswrapper[5008]: I0318 19:02:24.461150 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:02:42 crc kubenswrapper[5008]: I0318 19:02:42.261928 5008 scope.go:117] "RemoveContainer" containerID="f38c8736c1a4e49ef40eef05ae16137367a1c030516d333d411814dc23921f9a" Mar 18 19:02:54 crc kubenswrapper[5008]: I0318 19:02:54.460495 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:02:54 crc kubenswrapper[5008]: I0318 19:02:54.461133 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:02:54 crc kubenswrapper[5008]: I0318 19:02:54.461193 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 19:02:54 crc kubenswrapper[5008]: I0318 19:02:54.461722 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 19:02:54 crc kubenswrapper[5008]: I0318 19:02:54.461782 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" gracePeriod=600 Mar 18 19:02:54 crc kubenswrapper[5008]: E0318 19:02:54.590034 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:02:54 crc kubenswrapper[5008]: I0318 19:02:54.794493 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" exitCode=0 Mar 18 19:02:54 crc kubenswrapper[5008]: I0318 19:02:54.794543 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c"} Mar 18 19:02:54 crc kubenswrapper[5008]: I0318 19:02:54.794622 5008 scope.go:117] "RemoveContainer" containerID="1bf88bf8ac33b078694e8969100bad04b434510b86f4de3b740392ceba02f57d" Mar 18 19:02:54 crc kubenswrapper[5008]: I0318 19:02:54.795197 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:02:54 crc kubenswrapper[5008]: E0318 19:02:54.795593 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.625347 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rsh7s"] Mar 18 19:02:56 crc kubenswrapper[5008]: E0318 19:02:56.626353 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dace82ea-138d-410a-a0c0-61e591994595" containerName="oc" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.626377 5008 
state_mem.go:107] "Deleted CPUSet assignment" podUID="dace82ea-138d-410a-a0c0-61e591994595" containerName="oc" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.626879 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="dace82ea-138d-410a-a0c0-61e591994595" containerName="oc" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.629590 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.650818 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rsh7s"] Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.806049 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd2cd98-9af3-4ce4-9939-57c47c28b437-catalog-content\") pod \"certified-operators-rsh7s\" (UID: \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\") " pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.806139 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhqsh\" (UniqueName: \"kubernetes.io/projected/3bd2cd98-9af3-4ce4-9939-57c47c28b437-kube-api-access-fhqsh\") pod \"certified-operators-rsh7s\" (UID: \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\") " pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.806473 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd2cd98-9af3-4ce4-9939-57c47c28b437-utilities\") pod \"certified-operators-rsh7s\" (UID: \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\") " pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.907968 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd2cd98-9af3-4ce4-9939-57c47c28b437-utilities\") pod \"certified-operators-rsh7s\" (UID: \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\") " pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.908022 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd2cd98-9af3-4ce4-9939-57c47c28b437-catalog-content\") pod \"certified-operators-rsh7s\" (UID: \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\") " pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.908076 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhqsh\" (UniqueName: \"kubernetes.io/projected/3bd2cd98-9af3-4ce4-9939-57c47c28b437-kube-api-access-fhqsh\") pod \"certified-operators-rsh7s\" (UID: \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\") " pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.908736 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd2cd98-9af3-4ce4-9939-57c47c28b437-catalog-content\") pod \"certified-operators-rsh7s\" (UID: \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\") " pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.908763 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd2cd98-9af3-4ce4-9939-57c47c28b437-utilities\") pod \"certified-operators-rsh7s\" (UID: \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\") " pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.929624 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fhqsh\" (UniqueName: \"kubernetes.io/projected/3bd2cd98-9af3-4ce4-9939-57c47c28b437-kube-api-access-fhqsh\") pod \"certified-operators-rsh7s\" (UID: \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\") " pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:02:56 crc kubenswrapper[5008]: I0318 19:02:56.962824 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:02:57 crc kubenswrapper[5008]: I0318 19:02:57.434723 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rsh7s"] Mar 18 19:02:57 crc kubenswrapper[5008]: I0318 19:02:57.827136 5008 generic.go:334] "Generic (PLEG): container finished" podID="3bd2cd98-9af3-4ce4-9939-57c47c28b437" containerID="f89ec4633e341c137d37aaf725cea8bf4ee7b4fce458e6a59c074ff4299aa30e" exitCode=0 Mar 18 19:02:57 crc kubenswrapper[5008]: I0318 19:02:57.827207 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsh7s" event={"ID":"3bd2cd98-9af3-4ce4-9939-57c47c28b437","Type":"ContainerDied","Data":"f89ec4633e341c137d37aaf725cea8bf4ee7b4fce458e6a59c074ff4299aa30e"} Mar 18 19:02:57 crc kubenswrapper[5008]: I0318 19:02:57.827430 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsh7s" event={"ID":"3bd2cd98-9af3-4ce4-9939-57c47c28b437","Type":"ContainerStarted","Data":"c82fc02077dec445c5f32ea76d92e884e57725c97db2777519a6f1eb253166ac"} Mar 18 19:02:57 crc kubenswrapper[5008]: I0318 19:02:57.829636 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 19:02:59 crc kubenswrapper[5008]: I0318 19:02:59.848777 5008 generic.go:334] "Generic (PLEG): container finished" podID="3bd2cd98-9af3-4ce4-9939-57c47c28b437" containerID="f21a0dcde224f7746a338acca2afcabb6590d94cf6624f2c0eb26fc05677c4bd" exitCode=0 Mar 18 
19:02:59 crc kubenswrapper[5008]: I0318 19:02:59.848831 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsh7s" event={"ID":"3bd2cd98-9af3-4ce4-9939-57c47c28b437","Type":"ContainerDied","Data":"f21a0dcde224f7746a338acca2afcabb6590d94cf6624f2c0eb26fc05677c4bd"} Mar 18 19:03:00 crc kubenswrapper[5008]: I0318 19:03:00.858956 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsh7s" event={"ID":"3bd2cd98-9af3-4ce4-9939-57c47c28b437","Type":"ContainerStarted","Data":"d33bee62db3a08fba9c6c9f54f0b82b6f5f220971e87b0dfdc4c6a39f7ffaa8f"} Mar 18 19:03:00 crc kubenswrapper[5008]: I0318 19:03:00.890222 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rsh7s" podStartSLOduration=2.455581613 podStartE2EDuration="4.890202418s" podCreationTimestamp="2026-03-18 19:02:56 +0000 UTC" firstStartedPulling="2026-03-18 19:02:57.82897502 +0000 UTC m=+3634.348448149" lastFinishedPulling="2026-03-18 19:03:00.263595865 +0000 UTC m=+3636.783068954" observedRunningTime="2026-03-18 19:03:00.884712813 +0000 UTC m=+3637.404185912" watchObservedRunningTime="2026-03-18 19:03:00.890202418 +0000 UTC m=+3637.409675507" Mar 18 19:03:06 crc kubenswrapper[5008]: I0318 19:03:06.198017 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:03:06 crc kubenswrapper[5008]: E0318 19:03:06.198914 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:03:06 crc kubenswrapper[5008]: I0318 
19:03:06.963693 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:03:06 crc kubenswrapper[5008]: I0318 19:03:06.963749 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:03:07 crc kubenswrapper[5008]: I0318 19:03:07.014401 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:03:07 crc kubenswrapper[5008]: I0318 19:03:07.957835 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:03:08 crc kubenswrapper[5008]: I0318 19:03:08.008360 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rsh7s"] Mar 18 19:03:09 crc kubenswrapper[5008]: I0318 19:03:09.942446 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rsh7s" podUID="3bd2cd98-9af3-4ce4-9939-57c47c28b437" containerName="registry-server" containerID="cri-o://d33bee62db3a08fba9c6c9f54f0b82b6f5f220971e87b0dfdc4c6a39f7ffaa8f" gracePeriod=2 Mar 18 19:03:10 crc kubenswrapper[5008]: I0318 19:03:10.956598 5008 generic.go:334] "Generic (PLEG): container finished" podID="3bd2cd98-9af3-4ce4-9939-57c47c28b437" containerID="d33bee62db3a08fba9c6c9f54f0b82b6f5f220971e87b0dfdc4c6a39f7ffaa8f" exitCode=0 Mar 18 19:03:10 crc kubenswrapper[5008]: I0318 19:03:10.956846 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsh7s" event={"ID":"3bd2cd98-9af3-4ce4-9939-57c47c28b437","Type":"ContainerDied","Data":"d33bee62db3a08fba9c6c9f54f0b82b6f5f220971e87b0dfdc4c6a39f7ffaa8f"} Mar 18 19:03:10 crc kubenswrapper[5008]: I0318 19:03:10.957058 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-rsh7s" event={"ID":"3bd2cd98-9af3-4ce4-9939-57c47c28b437","Type":"ContainerDied","Data":"c82fc02077dec445c5f32ea76d92e884e57725c97db2777519a6f1eb253166ac"} Mar 18 19:03:10 crc kubenswrapper[5008]: I0318 19:03:10.957084 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c82fc02077dec445c5f32ea76d92e884e57725c97db2777519a6f1eb253166ac" Mar 18 19:03:10 crc kubenswrapper[5008]: I0318 19:03:10.957584 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:03:11 crc kubenswrapper[5008]: I0318 19:03:11.077852 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd2cd98-9af3-4ce4-9939-57c47c28b437-utilities\") pod \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\" (UID: \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\") " Mar 18 19:03:11 crc kubenswrapper[5008]: I0318 19:03:11.077993 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd2cd98-9af3-4ce4-9939-57c47c28b437-catalog-content\") pod \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\" (UID: \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\") " Mar 18 19:03:11 crc kubenswrapper[5008]: I0318 19:03:11.078054 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhqsh\" (UniqueName: \"kubernetes.io/projected/3bd2cd98-9af3-4ce4-9939-57c47c28b437-kube-api-access-fhqsh\") pod \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\" (UID: \"3bd2cd98-9af3-4ce4-9939-57c47c28b437\") " Mar 18 19:03:11 crc kubenswrapper[5008]: I0318 19:03:11.079835 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd2cd98-9af3-4ce4-9939-57c47c28b437-utilities" (OuterVolumeSpecName: "utilities") pod "3bd2cd98-9af3-4ce4-9939-57c47c28b437" (UID: 
"3bd2cd98-9af3-4ce4-9939-57c47c28b437"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:03:11 crc kubenswrapper[5008]: I0318 19:03:11.085315 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd2cd98-9af3-4ce4-9939-57c47c28b437-kube-api-access-fhqsh" (OuterVolumeSpecName: "kube-api-access-fhqsh") pod "3bd2cd98-9af3-4ce4-9939-57c47c28b437" (UID: "3bd2cd98-9af3-4ce4-9939-57c47c28b437"). InnerVolumeSpecName "kube-api-access-fhqsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:03:11 crc kubenswrapper[5008]: I0318 19:03:11.160066 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd2cd98-9af3-4ce4-9939-57c47c28b437-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bd2cd98-9af3-4ce4-9939-57c47c28b437" (UID: "3bd2cd98-9af3-4ce4-9939-57c47c28b437"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:03:11 crc kubenswrapper[5008]: I0318 19:03:11.179663 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd2cd98-9af3-4ce4-9939-57c47c28b437-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:03:11 crc kubenswrapper[5008]: I0318 19:03:11.179694 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhqsh\" (UniqueName: \"kubernetes.io/projected/3bd2cd98-9af3-4ce4-9939-57c47c28b437-kube-api-access-fhqsh\") on node \"crc\" DevicePath \"\"" Mar 18 19:03:11 crc kubenswrapper[5008]: I0318 19:03:11.179705 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd2cd98-9af3-4ce4-9939-57c47c28b437-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:03:11 crc kubenswrapper[5008]: I0318 19:03:11.966292 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rsh7s" Mar 18 19:03:12 crc kubenswrapper[5008]: I0318 19:03:12.017600 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rsh7s"] Mar 18 19:03:12 crc kubenswrapper[5008]: I0318 19:03:12.028006 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rsh7s"] Mar 18 19:03:12 crc kubenswrapper[5008]: I0318 19:03:12.214170 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd2cd98-9af3-4ce4-9939-57c47c28b437" path="/var/lib/kubelet/pods/3bd2cd98-9af3-4ce4-9939-57c47c28b437/volumes" Mar 18 19:03:20 crc kubenswrapper[5008]: I0318 19:03:20.198017 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:03:20 crc kubenswrapper[5008]: E0318 19:03:20.198517 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:03:35 crc kubenswrapper[5008]: I0318 19:03:35.198403 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:03:35 crc kubenswrapper[5008]: E0318 19:03:35.200871 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" 
podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:03:49 crc kubenswrapper[5008]: I0318 19:03:49.198531 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:03:49 crc kubenswrapper[5008]: E0318 19:03:49.199357 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.141807 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564344-5vmfj"] Mar 18 19:04:00 crc kubenswrapper[5008]: E0318 19:04:00.142481 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd2cd98-9af3-4ce4-9939-57c47c28b437" containerName="extract-content" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.142493 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd2cd98-9af3-4ce4-9939-57c47c28b437" containerName="extract-content" Mar 18 19:04:00 crc kubenswrapper[5008]: E0318 19:04:00.142506 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd2cd98-9af3-4ce4-9939-57c47c28b437" containerName="extract-utilities" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.142514 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd2cd98-9af3-4ce4-9939-57c47c28b437" containerName="extract-utilities" Mar 18 19:04:00 crc kubenswrapper[5008]: E0318 19:04:00.142525 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd2cd98-9af3-4ce4-9939-57c47c28b437" containerName="registry-server" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.142531 5008 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3bd2cd98-9af3-4ce4-9939-57c47c28b437" containerName="registry-server" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.142683 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd2cd98-9af3-4ce4-9939-57c47c28b437" containerName="registry-server" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.143163 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564344-5vmfj" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.146650 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.146654 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.146703 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.151945 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564344-5vmfj"] Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.268278 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkb47\" (UniqueName: \"kubernetes.io/projected/e71724ff-f0f7-409f-beb4-40eb4adbe13c-kube-api-access-mkb47\") pod \"auto-csr-approver-29564344-5vmfj\" (UID: \"e71724ff-f0f7-409f-beb4-40eb4adbe13c\") " pod="openshift-infra/auto-csr-approver-29564344-5vmfj" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.369767 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkb47\" (UniqueName: \"kubernetes.io/projected/e71724ff-f0f7-409f-beb4-40eb4adbe13c-kube-api-access-mkb47\") pod \"auto-csr-approver-29564344-5vmfj\" (UID: \"e71724ff-f0f7-409f-beb4-40eb4adbe13c\") " 
pod="openshift-infra/auto-csr-approver-29564344-5vmfj" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.387015 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkb47\" (UniqueName: \"kubernetes.io/projected/e71724ff-f0f7-409f-beb4-40eb4adbe13c-kube-api-access-mkb47\") pod \"auto-csr-approver-29564344-5vmfj\" (UID: \"e71724ff-f0f7-409f-beb4-40eb4adbe13c\") " pod="openshift-infra/auto-csr-approver-29564344-5vmfj" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.462802 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564344-5vmfj" Mar 18 19:04:00 crc kubenswrapper[5008]: I0318 19:04:00.902239 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564344-5vmfj"] Mar 18 19:04:01 crc kubenswrapper[5008]: I0318 19:04:01.399171 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564344-5vmfj" event={"ID":"e71724ff-f0f7-409f-beb4-40eb4adbe13c","Type":"ContainerStarted","Data":"027258f8fb15cda2dae782212d47e3842a7ff6ece80e524e0aecfb5de273d6dd"} Mar 18 19:04:02 crc kubenswrapper[5008]: I0318 19:04:02.197810 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:04:02 crc kubenswrapper[5008]: E0318 19:04:02.198101 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:04:03 crc kubenswrapper[5008]: I0318 19:04:03.413710 5008 generic.go:334] "Generic (PLEG): container finished" podID="e71724ff-f0f7-409f-beb4-40eb4adbe13c" 
containerID="46c31244dea16e91b912e40d135480273c7ae08199af3736e565ad5f265cf8f4" exitCode=0 Mar 18 19:04:03 crc kubenswrapper[5008]: I0318 19:04:03.413793 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564344-5vmfj" event={"ID":"e71724ff-f0f7-409f-beb4-40eb4adbe13c","Type":"ContainerDied","Data":"46c31244dea16e91b912e40d135480273c7ae08199af3736e565ad5f265cf8f4"} Mar 18 19:04:04 crc kubenswrapper[5008]: I0318 19:04:04.723338 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564344-5vmfj" Mar 18 19:04:04 crc kubenswrapper[5008]: I0318 19:04:04.829981 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkb47\" (UniqueName: \"kubernetes.io/projected/e71724ff-f0f7-409f-beb4-40eb4adbe13c-kube-api-access-mkb47\") pod \"e71724ff-f0f7-409f-beb4-40eb4adbe13c\" (UID: \"e71724ff-f0f7-409f-beb4-40eb4adbe13c\") " Mar 18 19:04:05 crc kubenswrapper[5008]: I0318 19:04:05.174447 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e71724ff-f0f7-409f-beb4-40eb4adbe13c-kube-api-access-mkb47" (OuterVolumeSpecName: "kube-api-access-mkb47") pod "e71724ff-f0f7-409f-beb4-40eb4adbe13c" (UID: "e71724ff-f0f7-409f-beb4-40eb4adbe13c"). InnerVolumeSpecName "kube-api-access-mkb47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:04:05 crc kubenswrapper[5008]: I0318 19:04:05.236020 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkb47\" (UniqueName: \"kubernetes.io/projected/e71724ff-f0f7-409f-beb4-40eb4adbe13c-kube-api-access-mkb47\") on node \"crc\" DevicePath \"\"" Mar 18 19:04:05 crc kubenswrapper[5008]: I0318 19:04:05.429689 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564344-5vmfj" event={"ID":"e71724ff-f0f7-409f-beb4-40eb4adbe13c","Type":"ContainerDied","Data":"027258f8fb15cda2dae782212d47e3842a7ff6ece80e524e0aecfb5de273d6dd"} Mar 18 19:04:05 crc kubenswrapper[5008]: I0318 19:04:05.429732 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="027258f8fb15cda2dae782212d47e3842a7ff6ece80e524e0aecfb5de273d6dd" Mar 18 19:04:05 crc kubenswrapper[5008]: I0318 19:04:05.429753 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564344-5vmfj" Mar 18 19:04:05 crc kubenswrapper[5008]: I0318 19:04:05.797478 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564338-zhbpt"] Mar 18 19:04:05 crc kubenswrapper[5008]: I0318 19:04:05.805157 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564338-zhbpt"] Mar 18 19:04:06 crc kubenswrapper[5008]: I0318 19:04:06.215226 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76db5842-8f74-450c-9965-9a1bc8ae06e1" path="/var/lib/kubelet/pods/76db5842-8f74-450c-9965-9a1bc8ae06e1/volumes" Mar 18 19:04:16 crc kubenswrapper[5008]: I0318 19:04:16.198395 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:04:16 crc kubenswrapper[5008]: E0318 19:04:16.199136 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:04:27 crc kubenswrapper[5008]: I0318 19:04:27.198479 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:04:27 crc kubenswrapper[5008]: E0318 19:04:27.199276 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:04:40 crc kubenswrapper[5008]: I0318 19:04:40.199015 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:04:40 crc kubenswrapper[5008]: E0318 19:04:40.199871 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:04:42 crc kubenswrapper[5008]: I0318 19:04:42.400881 5008 scope.go:117] "RemoveContainer" containerID="d3a270235e76f186dbb8b385d2655225809071c83154c3d39d1e4facb97b922d" Mar 18 19:04:51 crc kubenswrapper[5008]: I0318 19:04:51.198291 5008 scope.go:117] "RemoveContainer" 
containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:04:51 crc kubenswrapper[5008]: E0318 19:04:51.199102 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:05:06 crc kubenswrapper[5008]: I0318 19:05:06.200368 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:05:06 crc kubenswrapper[5008]: E0318 19:05:06.201521 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:05:20 crc kubenswrapper[5008]: I0318 19:05:20.198830 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:05:20 crc kubenswrapper[5008]: E0318 19:05:20.201636 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:05:31 crc kubenswrapper[5008]: I0318 19:05:31.199108 5008 scope.go:117] 
"RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:05:31 crc kubenswrapper[5008]: E0318 19:05:31.199977 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:05:42 crc kubenswrapper[5008]: I0318 19:05:42.199543 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:05:42 crc kubenswrapper[5008]: E0318 19:05:42.200675 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:05:54 crc kubenswrapper[5008]: I0318 19:05:54.206033 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:05:54 crc kubenswrapper[5008]: E0318 19:05:54.207012 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:06:00 crc kubenswrapper[5008]: I0318 19:06:00.153539 
5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564346-n55l7"] Mar 18 19:06:00 crc kubenswrapper[5008]: E0318 19:06:00.154647 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71724ff-f0f7-409f-beb4-40eb4adbe13c" containerName="oc" Mar 18 19:06:00 crc kubenswrapper[5008]: I0318 19:06:00.154667 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71724ff-f0f7-409f-beb4-40eb4adbe13c" containerName="oc" Mar 18 19:06:00 crc kubenswrapper[5008]: I0318 19:06:00.154910 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e71724ff-f0f7-409f-beb4-40eb4adbe13c" containerName="oc" Mar 18 19:06:00 crc kubenswrapper[5008]: I0318 19:06:00.155595 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564346-n55l7" Mar 18 19:06:00 crc kubenswrapper[5008]: I0318 19:06:00.157601 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:06:00 crc kubenswrapper[5008]: I0318 19:06:00.159261 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:06:00 crc kubenswrapper[5008]: I0318 19:06:00.168431 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:06:00 crc kubenswrapper[5008]: I0318 19:06:00.170952 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564346-n55l7"] Mar 18 19:06:00 crc kubenswrapper[5008]: I0318 19:06:00.291247 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bvbz\" (UniqueName: \"kubernetes.io/projected/23a3f922-a281-4b02-be27-354a0a8c20df-kube-api-access-7bvbz\") pod \"auto-csr-approver-29564346-n55l7\" (UID: \"23a3f922-a281-4b02-be27-354a0a8c20df\") " 
pod="openshift-infra/auto-csr-approver-29564346-n55l7" Mar 18 19:06:00 crc kubenswrapper[5008]: I0318 19:06:00.393036 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bvbz\" (UniqueName: \"kubernetes.io/projected/23a3f922-a281-4b02-be27-354a0a8c20df-kube-api-access-7bvbz\") pod \"auto-csr-approver-29564346-n55l7\" (UID: \"23a3f922-a281-4b02-be27-354a0a8c20df\") " pod="openshift-infra/auto-csr-approver-29564346-n55l7" Mar 18 19:06:00 crc kubenswrapper[5008]: I0318 19:06:00.427860 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bvbz\" (UniqueName: \"kubernetes.io/projected/23a3f922-a281-4b02-be27-354a0a8c20df-kube-api-access-7bvbz\") pod \"auto-csr-approver-29564346-n55l7\" (UID: \"23a3f922-a281-4b02-be27-354a0a8c20df\") " pod="openshift-infra/auto-csr-approver-29564346-n55l7" Mar 18 19:06:00 crc kubenswrapper[5008]: I0318 19:06:00.523600 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564346-n55l7" Mar 18 19:06:00 crc kubenswrapper[5008]: I0318 19:06:00.968807 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564346-n55l7"] Mar 18 19:06:01 crc kubenswrapper[5008]: I0318 19:06:01.498487 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564346-n55l7" event={"ID":"23a3f922-a281-4b02-be27-354a0a8c20df","Type":"ContainerStarted","Data":"50515e5d7db9495f305e10fd13b85748d224565750418d211b202177534e785c"} Mar 18 19:06:02 crc kubenswrapper[5008]: I0318 19:06:02.506372 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564346-n55l7" event={"ID":"23a3f922-a281-4b02-be27-354a0a8c20df","Type":"ContainerStarted","Data":"ebdf5ae40e66d37a3a0c516853396ee0e346b241b68adbe3dc32d4ede30cfbb9"} Mar 18 19:06:02 crc kubenswrapper[5008]: I0318 19:06:02.524452 5008 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564346-n55l7" podStartSLOduration=1.381122048 podStartE2EDuration="2.524424532s" podCreationTimestamp="2026-03-18 19:06:00 +0000 UTC" firstStartedPulling="2026-03-18 19:06:00.983755567 +0000 UTC m=+3817.503228646" lastFinishedPulling="2026-03-18 19:06:02.127058021 +0000 UTC m=+3818.646531130" observedRunningTime="2026-03-18 19:06:02.517999592 +0000 UTC m=+3819.037472671" watchObservedRunningTime="2026-03-18 19:06:02.524424532 +0000 UTC m=+3819.043897661" Mar 18 19:06:03 crc kubenswrapper[5008]: I0318 19:06:03.516782 5008 generic.go:334] "Generic (PLEG): container finished" podID="23a3f922-a281-4b02-be27-354a0a8c20df" containerID="ebdf5ae40e66d37a3a0c516853396ee0e346b241b68adbe3dc32d4ede30cfbb9" exitCode=0 Mar 18 19:06:03 crc kubenswrapper[5008]: I0318 19:06:03.516840 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564346-n55l7" event={"ID":"23a3f922-a281-4b02-be27-354a0a8c20df","Type":"ContainerDied","Data":"ebdf5ae40e66d37a3a0c516853396ee0e346b241b68adbe3dc32d4ede30cfbb9"} Mar 18 19:06:04 crc kubenswrapper[5008]: I0318 19:06:04.863748 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564346-n55l7" Mar 18 19:06:04 crc kubenswrapper[5008]: I0318 19:06:04.960239 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bvbz\" (UniqueName: \"kubernetes.io/projected/23a3f922-a281-4b02-be27-354a0a8c20df-kube-api-access-7bvbz\") pod \"23a3f922-a281-4b02-be27-354a0a8c20df\" (UID: \"23a3f922-a281-4b02-be27-354a0a8c20df\") " Mar 18 19:06:04 crc kubenswrapper[5008]: I0318 19:06:04.964694 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a3f922-a281-4b02-be27-354a0a8c20df-kube-api-access-7bvbz" (OuterVolumeSpecName: "kube-api-access-7bvbz") pod "23a3f922-a281-4b02-be27-354a0a8c20df" (UID: "23a3f922-a281-4b02-be27-354a0a8c20df"). InnerVolumeSpecName "kube-api-access-7bvbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:06:05 crc kubenswrapper[5008]: I0318 19:06:05.062913 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bvbz\" (UniqueName: \"kubernetes.io/projected/23a3f922-a281-4b02-be27-354a0a8c20df-kube-api-access-7bvbz\") on node \"crc\" DevicePath \"\"" Mar 18 19:06:05 crc kubenswrapper[5008]: I0318 19:06:05.549867 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564346-n55l7" Mar 18 19:06:05 crc kubenswrapper[5008]: I0318 19:06:05.549786 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564346-n55l7" event={"ID":"23a3f922-a281-4b02-be27-354a0a8c20df","Type":"ContainerDied","Data":"50515e5d7db9495f305e10fd13b85748d224565750418d211b202177534e785c"} Mar 18 19:06:05 crc kubenswrapper[5008]: I0318 19:06:05.550015 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50515e5d7db9495f305e10fd13b85748d224565750418d211b202177534e785c" Mar 18 19:06:05 crc kubenswrapper[5008]: I0318 19:06:05.593425 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564340-lsngw"] Mar 18 19:06:05 crc kubenswrapper[5008]: I0318 19:06:05.601190 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564340-lsngw"] Mar 18 19:06:06 crc kubenswrapper[5008]: I0318 19:06:06.207441 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc851209-de68-41aa-9342-980b3c267cda" path="/var/lib/kubelet/pods/fc851209-de68-41aa-9342-980b3c267cda/volumes" Mar 18 19:06:07 crc kubenswrapper[5008]: I0318 19:06:07.198654 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:06:07 crc kubenswrapper[5008]: E0318 19:06:07.199792 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:06:20 crc kubenswrapper[5008]: I0318 19:06:20.198689 5008 scope.go:117] "RemoveContainer" 
containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:06:20 crc kubenswrapper[5008]: E0318 19:06:20.200539 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:06:32 crc kubenswrapper[5008]: I0318 19:06:32.199217 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:06:32 crc kubenswrapper[5008]: E0318 19:06:32.200304 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:06:42 crc kubenswrapper[5008]: I0318 19:06:42.515719 5008 scope.go:117] "RemoveContainer" containerID="bd7088d90e33dd17336bef159f60658e6cacbe2692d551dc863d25fc1a99d120" Mar 18 19:06:44 crc kubenswrapper[5008]: I0318 19:06:44.205941 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:06:44 crc kubenswrapper[5008]: E0318 19:06:44.206857 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:06:57 crc kubenswrapper[5008]: I0318 19:06:57.199810 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:06:57 crc kubenswrapper[5008]: E0318 19:06:57.200352 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:07:08 crc kubenswrapper[5008]: I0318 19:07:08.200041 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:07:08 crc kubenswrapper[5008]: E0318 19:07:08.201360 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:07:23 crc kubenswrapper[5008]: I0318 19:07:23.199030 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:07:23 crc kubenswrapper[5008]: E0318 19:07:23.200136 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:07:37 crc kubenswrapper[5008]: I0318 19:07:37.198890 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:07:37 crc kubenswrapper[5008]: E0318 19:07:37.199972 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:07:48 crc kubenswrapper[5008]: I0318 19:07:48.198502 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:07:48 crc kubenswrapper[5008]: E0318 19:07:48.199315 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:08:00 crc kubenswrapper[5008]: I0318 19:08:00.163777 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564348-ljs2z"] Mar 18 19:08:00 crc kubenswrapper[5008]: E0318 19:08:00.164765 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a3f922-a281-4b02-be27-354a0a8c20df" containerName="oc" Mar 18 19:08:00 crc kubenswrapper[5008]: I0318 19:08:00.164788 5008 
state_mem.go:107] "Deleted CPUSet assignment" podUID="23a3f922-a281-4b02-be27-354a0a8c20df" containerName="oc" Mar 18 19:08:00 crc kubenswrapper[5008]: I0318 19:08:00.165037 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a3f922-a281-4b02-be27-354a0a8c20df" containerName="oc" Mar 18 19:08:00 crc kubenswrapper[5008]: I0318 19:08:00.165675 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564348-ljs2z" Mar 18 19:08:00 crc kubenswrapper[5008]: I0318 19:08:00.167675 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:08:00 crc kubenswrapper[5008]: I0318 19:08:00.167719 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:08:00 crc kubenswrapper[5008]: I0318 19:08:00.168497 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:08:00 crc kubenswrapper[5008]: I0318 19:08:00.187080 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564348-ljs2z"] Mar 18 19:08:00 crc kubenswrapper[5008]: I0318 19:08:00.261417 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzdcn\" (UniqueName: \"kubernetes.io/projected/30c14cdb-1d5c-4fab-b0f0-72c992899c8e-kube-api-access-nzdcn\") pod \"auto-csr-approver-29564348-ljs2z\" (UID: \"30c14cdb-1d5c-4fab-b0f0-72c992899c8e\") " pod="openshift-infra/auto-csr-approver-29564348-ljs2z" Mar 18 19:08:00 crc kubenswrapper[5008]: I0318 19:08:00.363023 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzdcn\" (UniqueName: \"kubernetes.io/projected/30c14cdb-1d5c-4fab-b0f0-72c992899c8e-kube-api-access-nzdcn\") pod \"auto-csr-approver-29564348-ljs2z\" (UID: \"30c14cdb-1d5c-4fab-b0f0-72c992899c8e\") " 
pod="openshift-infra/auto-csr-approver-29564348-ljs2z" Mar 18 19:08:00 crc kubenswrapper[5008]: I0318 19:08:00.383898 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzdcn\" (UniqueName: \"kubernetes.io/projected/30c14cdb-1d5c-4fab-b0f0-72c992899c8e-kube-api-access-nzdcn\") pod \"auto-csr-approver-29564348-ljs2z\" (UID: \"30c14cdb-1d5c-4fab-b0f0-72c992899c8e\") " pod="openshift-infra/auto-csr-approver-29564348-ljs2z" Mar 18 19:08:00 crc kubenswrapper[5008]: I0318 19:08:00.508145 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564348-ljs2z" Mar 18 19:08:01 crc kubenswrapper[5008]: I0318 19:08:01.052390 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564348-ljs2z"] Mar 18 19:08:01 crc kubenswrapper[5008]: I0318 19:08:01.053469 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 19:08:01 crc kubenswrapper[5008]: I0318 19:08:01.198400 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:08:01 crc kubenswrapper[5008]: I0318 19:08:01.557475 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"72a6b748f99aec5a0e48585405946db8a40fd1cec8156208fa6b54edd146499b"} Mar 18 19:08:01 crc kubenswrapper[5008]: I0318 19:08:01.558485 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564348-ljs2z" event={"ID":"30c14cdb-1d5c-4fab-b0f0-72c992899c8e","Type":"ContainerStarted","Data":"dc3a314bd91ec4e26f2c14f748e1073ea12a4ea7a90ccbbe7435e90f9288da30"} Mar 18 19:08:02 crc kubenswrapper[5008]: I0318 19:08:02.568363 5008 generic.go:334] "Generic (PLEG): container finished" 
podID="30c14cdb-1d5c-4fab-b0f0-72c992899c8e" containerID="1d3432ee28fc8cbfeae7ae631489b862ffdf795aacc107244206920842bf193e" exitCode=0 Mar 18 19:08:02 crc kubenswrapper[5008]: I0318 19:08:02.568443 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564348-ljs2z" event={"ID":"30c14cdb-1d5c-4fab-b0f0-72c992899c8e","Type":"ContainerDied","Data":"1d3432ee28fc8cbfeae7ae631489b862ffdf795aacc107244206920842bf193e"} Mar 18 19:08:03 crc kubenswrapper[5008]: I0318 19:08:03.911376 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564348-ljs2z" Mar 18 19:08:04 crc kubenswrapper[5008]: I0318 19:08:04.035851 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzdcn\" (UniqueName: \"kubernetes.io/projected/30c14cdb-1d5c-4fab-b0f0-72c992899c8e-kube-api-access-nzdcn\") pod \"30c14cdb-1d5c-4fab-b0f0-72c992899c8e\" (UID: \"30c14cdb-1d5c-4fab-b0f0-72c992899c8e\") " Mar 18 19:08:04 crc kubenswrapper[5008]: I0318 19:08:04.043806 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c14cdb-1d5c-4fab-b0f0-72c992899c8e-kube-api-access-nzdcn" (OuterVolumeSpecName: "kube-api-access-nzdcn") pod "30c14cdb-1d5c-4fab-b0f0-72c992899c8e" (UID: "30c14cdb-1d5c-4fab-b0f0-72c992899c8e"). InnerVolumeSpecName "kube-api-access-nzdcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:08:04 crc kubenswrapper[5008]: I0318 19:08:04.138264 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzdcn\" (UniqueName: \"kubernetes.io/projected/30c14cdb-1d5c-4fab-b0f0-72c992899c8e-kube-api-access-nzdcn\") on node \"crc\" DevicePath \"\"" Mar 18 19:08:04 crc kubenswrapper[5008]: I0318 19:08:04.593935 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564348-ljs2z" event={"ID":"30c14cdb-1d5c-4fab-b0f0-72c992899c8e","Type":"ContainerDied","Data":"dc3a314bd91ec4e26f2c14f748e1073ea12a4ea7a90ccbbe7435e90f9288da30"} Mar 18 19:08:04 crc kubenswrapper[5008]: I0318 19:08:04.594281 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc3a314bd91ec4e26f2c14f748e1073ea12a4ea7a90ccbbe7435e90f9288da30" Mar 18 19:08:04 crc kubenswrapper[5008]: I0318 19:08:04.593975 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564348-ljs2z" Mar 18 19:08:04 crc kubenswrapper[5008]: I0318 19:08:04.998133 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564342-kb52p"] Mar 18 19:08:05 crc kubenswrapper[5008]: I0318 19:08:05.004499 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564342-kb52p"] Mar 18 19:08:06 crc kubenswrapper[5008]: I0318 19:08:06.209126 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dace82ea-138d-410a-a0c0-61e591994595" path="/var/lib/kubelet/pods/dace82ea-138d-410a-a0c0-61e591994595/volumes" Mar 18 19:08:42 crc kubenswrapper[5008]: I0318 19:08:42.635793 5008 scope.go:117] "RemoveContainer" containerID="2d7e628a9eeffcd12a0a81c3f2e975b63f449659509166139041aac6675d02e3" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.093497 5008 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-w5jrx"] Mar 18 19:09:23 crc kubenswrapper[5008]: E0318 19:09:23.094865 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c14cdb-1d5c-4fab-b0f0-72c992899c8e" containerName="oc" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.094885 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c14cdb-1d5c-4fab-b0f0-72c992899c8e" containerName="oc" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.095316 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c14cdb-1d5c-4fab-b0f0-72c992899c8e" containerName="oc" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.097687 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.116143 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5jrx"] Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.236232 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a7702ee-f2bb-440e-970a-07228e4d0d32-catalog-content\") pod \"redhat-marketplace-w5jrx\" (UID: \"0a7702ee-f2bb-440e-970a-07228e4d0d32\") " pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.236284 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a7702ee-f2bb-440e-970a-07228e4d0d32-utilities\") pod \"redhat-marketplace-w5jrx\" (UID: \"0a7702ee-f2bb-440e-970a-07228e4d0d32\") " pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.236530 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfd8g\" (UniqueName: 
\"kubernetes.io/projected/0a7702ee-f2bb-440e-970a-07228e4d0d32-kube-api-access-jfd8g\") pod \"redhat-marketplace-w5jrx\" (UID: \"0a7702ee-f2bb-440e-970a-07228e4d0d32\") " pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.338347 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfd8g\" (UniqueName: \"kubernetes.io/projected/0a7702ee-f2bb-440e-970a-07228e4d0d32-kube-api-access-jfd8g\") pod \"redhat-marketplace-w5jrx\" (UID: \"0a7702ee-f2bb-440e-970a-07228e4d0d32\") " pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.338517 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a7702ee-f2bb-440e-970a-07228e4d0d32-catalog-content\") pod \"redhat-marketplace-w5jrx\" (UID: \"0a7702ee-f2bb-440e-970a-07228e4d0d32\") " pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.338549 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a7702ee-f2bb-440e-970a-07228e4d0d32-utilities\") pod \"redhat-marketplace-w5jrx\" (UID: \"0a7702ee-f2bb-440e-970a-07228e4d0d32\") " pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.339217 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a7702ee-f2bb-440e-970a-07228e4d0d32-utilities\") pod \"redhat-marketplace-w5jrx\" (UID: \"0a7702ee-f2bb-440e-970a-07228e4d0d32\") " pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.339490 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0a7702ee-f2bb-440e-970a-07228e4d0d32-catalog-content\") pod \"redhat-marketplace-w5jrx\" (UID: \"0a7702ee-f2bb-440e-970a-07228e4d0d32\") " pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.366656 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfd8g\" (UniqueName: \"kubernetes.io/projected/0a7702ee-f2bb-440e-970a-07228e4d0d32-kube-api-access-jfd8g\") pod \"redhat-marketplace-w5jrx\" (UID: \"0a7702ee-f2bb-440e-970a-07228e4d0d32\") " pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.436774 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:23 crc kubenswrapper[5008]: I0318 19:09:23.911779 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5jrx"] Mar 18 19:09:24 crc kubenswrapper[5008]: I0318 19:09:24.288343 5008 generic.go:334] "Generic (PLEG): container finished" podID="0a7702ee-f2bb-440e-970a-07228e4d0d32" containerID="a50d48b7f2725f37be1f433e05278b536492449c64a4bbe7affb8306a0011857" exitCode=0 Mar 18 19:09:24 crc kubenswrapper[5008]: I0318 19:09:24.288392 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5jrx" event={"ID":"0a7702ee-f2bb-440e-970a-07228e4d0d32","Type":"ContainerDied","Data":"a50d48b7f2725f37be1f433e05278b536492449c64a4bbe7affb8306a0011857"} Mar 18 19:09:24 crc kubenswrapper[5008]: I0318 19:09:24.288427 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5jrx" event={"ID":"0a7702ee-f2bb-440e-970a-07228e4d0d32","Type":"ContainerStarted","Data":"ca4463a023bd8e3e84737fd7873f95ef6179efc085d3f0fb4145c93674cc7082"} Mar 18 19:09:25 crc kubenswrapper[5008]: I0318 19:09:25.295367 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-w5jrx" event={"ID":"0a7702ee-f2bb-440e-970a-07228e4d0d32","Type":"ContainerStarted","Data":"5f4857f8e2df52ea679a4ae95de54aef017d9cab580c9242648e07107651b740"} Mar 18 19:09:26 crc kubenswrapper[5008]: I0318 19:09:26.304442 5008 generic.go:334] "Generic (PLEG): container finished" podID="0a7702ee-f2bb-440e-970a-07228e4d0d32" containerID="5f4857f8e2df52ea679a4ae95de54aef017d9cab580c9242648e07107651b740" exitCode=0 Mar 18 19:09:26 crc kubenswrapper[5008]: I0318 19:09:26.304480 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5jrx" event={"ID":"0a7702ee-f2bb-440e-970a-07228e4d0d32","Type":"ContainerDied","Data":"5f4857f8e2df52ea679a4ae95de54aef017d9cab580c9242648e07107651b740"} Mar 18 19:09:27 crc kubenswrapper[5008]: I0318 19:09:27.315209 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5jrx" event={"ID":"0a7702ee-f2bb-440e-970a-07228e4d0d32","Type":"ContainerStarted","Data":"9765167435c3712915f721102b965091e9aa30ab5b28ef2abb4578667aa5728d"} Mar 18 19:09:27 crc kubenswrapper[5008]: I0318 19:09:27.337096 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w5jrx" podStartSLOduration=1.855625019 podStartE2EDuration="4.337065641s" podCreationTimestamp="2026-03-18 19:09:23 +0000 UTC" firstStartedPulling="2026-03-18 19:09:24.291643969 +0000 UTC m=+4020.811117048" lastFinishedPulling="2026-03-18 19:09:26.773084571 +0000 UTC m=+4023.292557670" observedRunningTime="2026-03-18 19:09:27.332855971 +0000 UTC m=+4023.852329060" watchObservedRunningTime="2026-03-18 19:09:27.337065641 +0000 UTC m=+4023.856538760" Mar 18 19:09:33 crc kubenswrapper[5008]: I0318 19:09:33.437506 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:33 crc kubenswrapper[5008]: I0318 19:09:33.437908 5008 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:33 crc kubenswrapper[5008]: I0318 19:09:33.636909 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:34 crc kubenswrapper[5008]: I0318 19:09:34.436372 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:34 crc kubenswrapper[5008]: I0318 19:09:34.490506 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5jrx"] Mar 18 19:09:36 crc kubenswrapper[5008]: I0318 19:09:36.396811 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w5jrx" podUID="0a7702ee-f2bb-440e-970a-07228e4d0d32" containerName="registry-server" containerID="cri-o://9765167435c3712915f721102b965091e9aa30ab5b28ef2abb4578667aa5728d" gracePeriod=2 Mar 18 19:09:36 crc kubenswrapper[5008]: I0318 19:09:36.894943 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:36 crc kubenswrapper[5008]: I0318 19:09:36.946383 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfd8g\" (UniqueName: \"kubernetes.io/projected/0a7702ee-f2bb-440e-970a-07228e4d0d32-kube-api-access-jfd8g\") pod \"0a7702ee-f2bb-440e-970a-07228e4d0d32\" (UID: \"0a7702ee-f2bb-440e-970a-07228e4d0d32\") " Mar 18 19:09:36 crc kubenswrapper[5008]: I0318 19:09:36.946768 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a7702ee-f2bb-440e-970a-07228e4d0d32-catalog-content\") pod \"0a7702ee-f2bb-440e-970a-07228e4d0d32\" (UID: \"0a7702ee-f2bb-440e-970a-07228e4d0d32\") " Mar 18 19:09:36 crc kubenswrapper[5008]: I0318 19:09:36.946875 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a7702ee-f2bb-440e-970a-07228e4d0d32-utilities\") pod \"0a7702ee-f2bb-440e-970a-07228e4d0d32\" (UID: \"0a7702ee-f2bb-440e-970a-07228e4d0d32\") " Mar 18 19:09:36 crc kubenswrapper[5008]: I0318 19:09:36.950371 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a7702ee-f2bb-440e-970a-07228e4d0d32-utilities" (OuterVolumeSpecName: "utilities") pod "0a7702ee-f2bb-440e-970a-07228e4d0d32" (UID: "0a7702ee-f2bb-440e-970a-07228e4d0d32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:09:36 crc kubenswrapper[5008]: I0318 19:09:36.953333 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a7702ee-f2bb-440e-970a-07228e4d0d32-kube-api-access-jfd8g" (OuterVolumeSpecName: "kube-api-access-jfd8g") pod "0a7702ee-f2bb-440e-970a-07228e4d0d32" (UID: "0a7702ee-f2bb-440e-970a-07228e4d0d32"). InnerVolumeSpecName "kube-api-access-jfd8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:09:36 crc kubenswrapper[5008]: I0318 19:09:36.986230 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a7702ee-f2bb-440e-970a-07228e4d0d32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a7702ee-f2bb-440e-970a-07228e4d0d32" (UID: "0a7702ee-f2bb-440e-970a-07228e4d0d32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.048352 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a7702ee-f2bb-440e-970a-07228e4d0d32-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.048393 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a7702ee-f2bb-440e-970a-07228e4d0d32-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.048407 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfd8g\" (UniqueName: \"kubernetes.io/projected/0a7702ee-f2bb-440e-970a-07228e4d0d32-kube-api-access-jfd8g\") on node \"crc\" DevicePath \"\"" Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.406416 5008 generic.go:334] "Generic (PLEG): container finished" podID="0a7702ee-f2bb-440e-970a-07228e4d0d32" containerID="9765167435c3712915f721102b965091e9aa30ab5b28ef2abb4578667aa5728d" exitCode=0 Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.406493 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5jrx" event={"ID":"0a7702ee-f2bb-440e-970a-07228e4d0d32","Type":"ContainerDied","Data":"9765167435c3712915f721102b965091e9aa30ab5b28ef2abb4578667aa5728d"} Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.406535 5008 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-w5jrx" event={"ID":"0a7702ee-f2bb-440e-970a-07228e4d0d32","Type":"ContainerDied","Data":"ca4463a023bd8e3e84737fd7873f95ef6179efc085d3f0fb4145c93674cc7082"} Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.406604 5008 scope.go:117] "RemoveContainer" containerID="9765167435c3712915f721102b965091e9aa30ab5b28ef2abb4578667aa5728d" Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.406849 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w5jrx" Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.440132 5008 scope.go:117] "RemoveContainer" containerID="5f4857f8e2df52ea679a4ae95de54aef017d9cab580c9242648e07107651b740" Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.469189 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5jrx"] Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.480887 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5jrx"] Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.481768 5008 scope.go:117] "RemoveContainer" containerID="a50d48b7f2725f37be1f433e05278b536492449c64a4bbe7affb8306a0011857" Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.520866 5008 scope.go:117] "RemoveContainer" containerID="9765167435c3712915f721102b965091e9aa30ab5b28ef2abb4578667aa5728d" Mar 18 19:09:37 crc kubenswrapper[5008]: E0318 19:09:37.521533 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9765167435c3712915f721102b965091e9aa30ab5b28ef2abb4578667aa5728d\": container with ID starting with 9765167435c3712915f721102b965091e9aa30ab5b28ef2abb4578667aa5728d not found: ID does not exist" containerID="9765167435c3712915f721102b965091e9aa30ab5b28ef2abb4578667aa5728d" Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.521646 5008 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9765167435c3712915f721102b965091e9aa30ab5b28ef2abb4578667aa5728d"} err="failed to get container status \"9765167435c3712915f721102b965091e9aa30ab5b28ef2abb4578667aa5728d\": rpc error: code = NotFound desc = could not find container \"9765167435c3712915f721102b965091e9aa30ab5b28ef2abb4578667aa5728d\": container with ID starting with 9765167435c3712915f721102b965091e9aa30ab5b28ef2abb4578667aa5728d not found: ID does not exist" Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.521678 5008 scope.go:117] "RemoveContainer" containerID="5f4857f8e2df52ea679a4ae95de54aef017d9cab580c9242648e07107651b740" Mar 18 19:09:37 crc kubenswrapper[5008]: E0318 19:09:37.522458 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f4857f8e2df52ea679a4ae95de54aef017d9cab580c9242648e07107651b740\": container with ID starting with 5f4857f8e2df52ea679a4ae95de54aef017d9cab580c9242648e07107651b740 not found: ID does not exist" containerID="5f4857f8e2df52ea679a4ae95de54aef017d9cab580c9242648e07107651b740" Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.522520 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f4857f8e2df52ea679a4ae95de54aef017d9cab580c9242648e07107651b740"} err="failed to get container status \"5f4857f8e2df52ea679a4ae95de54aef017d9cab580c9242648e07107651b740\": rpc error: code = NotFound desc = could not find container \"5f4857f8e2df52ea679a4ae95de54aef017d9cab580c9242648e07107651b740\": container with ID starting with 5f4857f8e2df52ea679a4ae95de54aef017d9cab580c9242648e07107651b740 not found: ID does not exist" Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.522542 5008 scope.go:117] "RemoveContainer" containerID="a50d48b7f2725f37be1f433e05278b536492449c64a4bbe7affb8306a0011857" Mar 18 19:09:37 crc kubenswrapper[5008]: E0318 
19:09:37.522899 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50d48b7f2725f37be1f433e05278b536492449c64a4bbe7affb8306a0011857\": container with ID starting with a50d48b7f2725f37be1f433e05278b536492449c64a4bbe7affb8306a0011857 not found: ID does not exist" containerID="a50d48b7f2725f37be1f433e05278b536492449c64a4bbe7affb8306a0011857" Mar 18 19:09:37 crc kubenswrapper[5008]: I0318 19:09:37.522971 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50d48b7f2725f37be1f433e05278b536492449c64a4bbe7affb8306a0011857"} err="failed to get container status \"a50d48b7f2725f37be1f433e05278b536492449c64a4bbe7affb8306a0011857\": rpc error: code = NotFound desc = could not find container \"a50d48b7f2725f37be1f433e05278b536492449c64a4bbe7affb8306a0011857\": container with ID starting with a50d48b7f2725f37be1f433e05278b536492449c64a4bbe7affb8306a0011857 not found: ID does not exist" Mar 18 19:09:38 crc kubenswrapper[5008]: I0318 19:09:38.207016 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a7702ee-f2bb-440e-970a-07228e4d0d32" path="/var/lib/kubelet/pods/0a7702ee-f2bb-440e-970a-07228e4d0d32/volumes" Mar 18 19:09:42 crc kubenswrapper[5008]: I0318 19:09:42.728876 5008 scope.go:117] "RemoveContainer" containerID="f89ec4633e341c137d37aaf725cea8bf4ee7b4fce458e6a59c074ff4299aa30e" Mar 18 19:09:42 crc kubenswrapper[5008]: I0318 19:09:42.754179 5008 scope.go:117] "RemoveContainer" containerID="d33bee62db3a08fba9c6c9f54f0b82b6f5f220971e87b0dfdc4c6a39f7ffaa8f" Mar 18 19:09:42 crc kubenswrapper[5008]: I0318 19:09:42.781204 5008 scope.go:117] "RemoveContainer" containerID="f21a0dcde224f7746a338acca2afcabb6590d94cf6624f2c0eb26fc05677c4bd" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.170465 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564350-msdmw"] Mar 18 19:10:00 crc 
kubenswrapper[5008]: E0318 19:10:00.171610 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7702ee-f2bb-440e-970a-07228e4d0d32" containerName="registry-server" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.171633 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7702ee-f2bb-440e-970a-07228e4d0d32" containerName="registry-server" Mar 18 19:10:00 crc kubenswrapper[5008]: E0318 19:10:00.171651 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7702ee-f2bb-440e-970a-07228e4d0d32" containerName="extract-utilities" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.171666 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7702ee-f2bb-440e-970a-07228e4d0d32" containerName="extract-utilities" Mar 18 19:10:00 crc kubenswrapper[5008]: E0318 19:10:00.171685 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7702ee-f2bb-440e-970a-07228e4d0d32" containerName="extract-content" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.171697 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7702ee-f2bb-440e-970a-07228e4d0d32" containerName="extract-content" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.171958 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a7702ee-f2bb-440e-970a-07228e4d0d32" containerName="registry-server" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.172706 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564350-msdmw" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.176896 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.176927 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.178225 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.194140 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564350-msdmw"] Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.231429 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvv6v\" (UniqueName: \"kubernetes.io/projected/0ab807b4-57f7-4a89-9960-6168ded9ee75-kube-api-access-dvv6v\") pod \"auto-csr-approver-29564350-msdmw\" (UID: \"0ab807b4-57f7-4a89-9960-6168ded9ee75\") " pod="openshift-infra/auto-csr-approver-29564350-msdmw" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.332857 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvv6v\" (UniqueName: \"kubernetes.io/projected/0ab807b4-57f7-4a89-9960-6168ded9ee75-kube-api-access-dvv6v\") pod \"auto-csr-approver-29564350-msdmw\" (UID: \"0ab807b4-57f7-4a89-9960-6168ded9ee75\") " pod="openshift-infra/auto-csr-approver-29564350-msdmw" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.359927 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvv6v\" (UniqueName: \"kubernetes.io/projected/0ab807b4-57f7-4a89-9960-6168ded9ee75-kube-api-access-dvv6v\") pod \"auto-csr-approver-29564350-msdmw\" (UID: \"0ab807b4-57f7-4a89-9960-6168ded9ee75\") " 
pod="openshift-infra/auto-csr-approver-29564350-msdmw" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.495211 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564350-msdmw" Mar 18 19:10:00 crc kubenswrapper[5008]: I0318 19:10:00.978440 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564350-msdmw"] Mar 18 19:10:00 crc kubenswrapper[5008]: W0318 19:10:00.990057 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ab807b4_57f7_4a89_9960_6168ded9ee75.slice/crio-1c1a7bce4a1eca1f1b8c137fc9d99ab9e2516eca5a6d8f40557a586f017b1ec6 WatchSource:0}: Error finding container 1c1a7bce4a1eca1f1b8c137fc9d99ab9e2516eca5a6d8f40557a586f017b1ec6: Status 404 returned error can't find the container with id 1c1a7bce4a1eca1f1b8c137fc9d99ab9e2516eca5a6d8f40557a586f017b1ec6 Mar 18 19:10:01 crc kubenswrapper[5008]: I0318 19:10:01.658456 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564350-msdmw" event={"ID":"0ab807b4-57f7-4a89-9960-6168ded9ee75","Type":"ContainerStarted","Data":"1c1a7bce4a1eca1f1b8c137fc9d99ab9e2516eca5a6d8f40557a586f017b1ec6"} Mar 18 19:10:02 crc kubenswrapper[5008]: I0318 19:10:02.667425 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564350-msdmw" event={"ID":"0ab807b4-57f7-4a89-9960-6168ded9ee75","Type":"ContainerStarted","Data":"db08d893569f33aa9300f5bd9a940a8f11ad957fca9fc0f2752dea94fb6ccc83"} Mar 18 19:10:02 crc kubenswrapper[5008]: I0318 19:10:02.685127 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564350-msdmw" podStartSLOduration=1.393663306 podStartE2EDuration="2.685106029s" podCreationTimestamp="2026-03-18 19:10:00 +0000 UTC" firstStartedPulling="2026-03-18 19:10:01.002816762 +0000 UTC 
m=+4057.522289881" lastFinishedPulling="2026-03-18 19:10:02.294259485 +0000 UTC m=+4058.813732604" observedRunningTime="2026-03-18 19:10:02.681398841 +0000 UTC m=+4059.200871920" watchObservedRunningTime="2026-03-18 19:10:02.685106029 +0000 UTC m=+4059.204579118" Mar 18 19:10:03 crc kubenswrapper[5008]: I0318 19:10:03.687727 5008 generic.go:334] "Generic (PLEG): container finished" podID="0ab807b4-57f7-4a89-9960-6168ded9ee75" containerID="db08d893569f33aa9300f5bd9a940a8f11ad957fca9fc0f2752dea94fb6ccc83" exitCode=0 Mar 18 19:10:03 crc kubenswrapper[5008]: I0318 19:10:03.687905 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564350-msdmw" event={"ID":"0ab807b4-57f7-4a89-9960-6168ded9ee75","Type":"ContainerDied","Data":"db08d893569f33aa9300f5bd9a940a8f11ad957fca9fc0f2752dea94fb6ccc83"} Mar 18 19:10:05 crc kubenswrapper[5008]: I0318 19:10:05.016234 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564350-msdmw" Mar 18 19:10:05 crc kubenswrapper[5008]: I0318 19:10:05.212147 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvv6v\" (UniqueName: \"kubernetes.io/projected/0ab807b4-57f7-4a89-9960-6168ded9ee75-kube-api-access-dvv6v\") pod \"0ab807b4-57f7-4a89-9960-6168ded9ee75\" (UID: \"0ab807b4-57f7-4a89-9960-6168ded9ee75\") " Mar 18 19:10:05 crc kubenswrapper[5008]: I0318 19:10:05.615744 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab807b4-57f7-4a89-9960-6168ded9ee75-kube-api-access-dvv6v" (OuterVolumeSpecName: "kube-api-access-dvv6v") pod "0ab807b4-57f7-4a89-9960-6168ded9ee75" (UID: "0ab807b4-57f7-4a89-9960-6168ded9ee75"). InnerVolumeSpecName "kube-api-access-dvv6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:10:05 crc kubenswrapper[5008]: I0318 19:10:05.618213 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvv6v\" (UniqueName: \"kubernetes.io/projected/0ab807b4-57f7-4a89-9960-6168ded9ee75-kube-api-access-dvv6v\") on node \"crc\" DevicePath \"\"" Mar 18 19:10:05 crc kubenswrapper[5008]: I0318 19:10:05.713152 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564350-msdmw" event={"ID":"0ab807b4-57f7-4a89-9960-6168ded9ee75","Type":"ContainerDied","Data":"1c1a7bce4a1eca1f1b8c137fc9d99ab9e2516eca5a6d8f40557a586f017b1ec6"} Mar 18 19:10:05 crc kubenswrapper[5008]: I0318 19:10:05.713708 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c1a7bce4a1eca1f1b8c137fc9d99ab9e2516eca5a6d8f40557a586f017b1ec6" Mar 18 19:10:05 crc kubenswrapper[5008]: I0318 19:10:05.713893 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564350-msdmw" Mar 18 19:10:05 crc kubenswrapper[5008]: I0318 19:10:05.768159 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564344-5vmfj"] Mar 18 19:10:05 crc kubenswrapper[5008]: I0318 19:10:05.775975 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564344-5vmfj"] Mar 18 19:10:06 crc kubenswrapper[5008]: I0318 19:10:06.206038 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e71724ff-f0f7-409f-beb4-40eb4adbe13c" path="/var/lib/kubelet/pods/e71724ff-f0f7-409f-beb4-40eb4adbe13c/volumes" Mar 18 19:10:24 crc kubenswrapper[5008]: I0318 19:10:24.460746 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 19:10:24 crc kubenswrapper[5008]: I0318 19:10:24.461356 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:10:42 crc kubenswrapper[5008]: I0318 19:10:42.847227 5008 scope.go:117] "RemoveContainer" containerID="46c31244dea16e91b912e40d135480273c7ae08199af3736e565ad5f265cf8f4" Mar 18 19:10:54 crc kubenswrapper[5008]: I0318 19:10:54.460112 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:10:54 crc kubenswrapper[5008]: I0318 19:10:54.460883 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:11:24 crc kubenswrapper[5008]: I0318 19:11:24.460946 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:11:24 crc kubenswrapper[5008]: I0318 19:11:24.461725 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:11:24 crc kubenswrapper[5008]: I0318 19:11:24.461816 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 19:11:24 crc kubenswrapper[5008]: I0318 19:11:24.462897 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"72a6b748f99aec5a0e48585405946db8a40fd1cec8156208fa6b54edd146499b"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 19:11:24 crc kubenswrapper[5008]: I0318 19:11:24.463008 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://72a6b748f99aec5a0e48585405946db8a40fd1cec8156208fa6b54edd146499b" gracePeriod=600 Mar 18 19:11:25 crc kubenswrapper[5008]: I0318 19:11:25.422975 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="72a6b748f99aec5a0e48585405946db8a40fd1cec8156208fa6b54edd146499b" exitCode=0 Mar 18 19:11:25 crc kubenswrapper[5008]: I0318 19:11:25.423083 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"72a6b748f99aec5a0e48585405946db8a40fd1cec8156208fa6b54edd146499b"} Mar 18 19:11:25 crc kubenswrapper[5008]: I0318 19:11:25.423669 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" 
event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86"} Mar 18 19:11:25 crc kubenswrapper[5008]: I0318 19:11:25.423706 5008 scope.go:117] "RemoveContainer" containerID="be0d1e7a04d0f59d2d82661c3eff510529ae1c436b11600c8006d0cc70a6656c" Mar 18 19:12:00 crc kubenswrapper[5008]: I0318 19:12:00.164050 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564352-cl8cg"] Mar 18 19:12:00 crc kubenswrapper[5008]: E0318 19:12:00.164878 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab807b4-57f7-4a89-9960-6168ded9ee75" containerName="oc" Mar 18 19:12:00 crc kubenswrapper[5008]: I0318 19:12:00.164892 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab807b4-57f7-4a89-9960-6168ded9ee75" containerName="oc" Mar 18 19:12:00 crc kubenswrapper[5008]: I0318 19:12:00.165044 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab807b4-57f7-4a89-9960-6168ded9ee75" containerName="oc" Mar 18 19:12:00 crc kubenswrapper[5008]: I0318 19:12:00.165506 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564352-cl8cg" Mar 18 19:12:00 crc kubenswrapper[5008]: I0318 19:12:00.168328 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:12:00 crc kubenswrapper[5008]: I0318 19:12:00.170160 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:12:00 crc kubenswrapper[5008]: I0318 19:12:00.172970 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:12:00 crc kubenswrapper[5008]: I0318 19:12:00.181600 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564352-cl8cg"] Mar 18 19:12:00 crc kubenswrapper[5008]: I0318 19:12:00.220945 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbld9\" (UniqueName: \"kubernetes.io/projected/7adc401c-a2de-495c-8bd5-a6979e497ead-kube-api-access-jbld9\") pod \"auto-csr-approver-29564352-cl8cg\" (UID: \"7adc401c-a2de-495c-8bd5-a6979e497ead\") " pod="openshift-infra/auto-csr-approver-29564352-cl8cg" Mar 18 19:12:00 crc kubenswrapper[5008]: I0318 19:12:00.322353 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbld9\" (UniqueName: \"kubernetes.io/projected/7adc401c-a2de-495c-8bd5-a6979e497ead-kube-api-access-jbld9\") pod \"auto-csr-approver-29564352-cl8cg\" (UID: \"7adc401c-a2de-495c-8bd5-a6979e497ead\") " pod="openshift-infra/auto-csr-approver-29564352-cl8cg" Mar 18 19:12:00 crc kubenswrapper[5008]: I0318 19:12:00.351966 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbld9\" (UniqueName: \"kubernetes.io/projected/7adc401c-a2de-495c-8bd5-a6979e497ead-kube-api-access-jbld9\") pod \"auto-csr-approver-29564352-cl8cg\" (UID: \"7adc401c-a2de-495c-8bd5-a6979e497ead\") " 
pod="openshift-infra/auto-csr-approver-29564352-cl8cg" Mar 18 19:12:00 crc kubenswrapper[5008]: I0318 19:12:00.509312 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564352-cl8cg" Mar 18 19:12:00 crc kubenswrapper[5008]: I0318 19:12:00.754106 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564352-cl8cg"] Mar 18 19:12:01 crc kubenswrapper[5008]: I0318 19:12:01.734926 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564352-cl8cg" event={"ID":"7adc401c-a2de-495c-8bd5-a6979e497ead","Type":"ContainerStarted","Data":"c585c0ddd33e6690d3090f7baae037cd74ce94f79f429c29cc368de61f7cb7bc"} Mar 18 19:12:02 crc kubenswrapper[5008]: I0318 19:12:02.749464 5008 generic.go:334] "Generic (PLEG): container finished" podID="7adc401c-a2de-495c-8bd5-a6979e497ead" containerID="aa1ccc47f6467659a5cd35841ee4aa57415399474e80bf0862afdfd514f7d0c6" exitCode=0 Mar 18 19:12:02 crc kubenswrapper[5008]: I0318 19:12:02.749597 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564352-cl8cg" event={"ID":"7adc401c-a2de-495c-8bd5-a6979e497ead","Type":"ContainerDied","Data":"aa1ccc47f6467659a5cd35841ee4aa57415399474e80bf0862afdfd514f7d0c6"} Mar 18 19:12:04 crc kubenswrapper[5008]: I0318 19:12:04.132811 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564352-cl8cg" Mar 18 19:12:04 crc kubenswrapper[5008]: I0318 19:12:04.289698 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbld9\" (UniqueName: \"kubernetes.io/projected/7adc401c-a2de-495c-8bd5-a6979e497ead-kube-api-access-jbld9\") pod \"7adc401c-a2de-495c-8bd5-a6979e497ead\" (UID: \"7adc401c-a2de-495c-8bd5-a6979e497ead\") " Mar 18 19:12:04 crc kubenswrapper[5008]: I0318 19:12:04.294891 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7adc401c-a2de-495c-8bd5-a6979e497ead-kube-api-access-jbld9" (OuterVolumeSpecName: "kube-api-access-jbld9") pod "7adc401c-a2de-495c-8bd5-a6979e497ead" (UID: "7adc401c-a2de-495c-8bd5-a6979e497ead"). InnerVolumeSpecName "kube-api-access-jbld9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:12:04 crc kubenswrapper[5008]: I0318 19:12:04.392588 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbld9\" (UniqueName: \"kubernetes.io/projected/7adc401c-a2de-495c-8bd5-a6979e497ead-kube-api-access-jbld9\") on node \"crc\" DevicePath \"\"" Mar 18 19:12:04 crc kubenswrapper[5008]: I0318 19:12:04.780413 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564352-cl8cg" event={"ID":"7adc401c-a2de-495c-8bd5-a6979e497ead","Type":"ContainerDied","Data":"c585c0ddd33e6690d3090f7baae037cd74ce94f79f429c29cc368de61f7cb7bc"} Mar 18 19:12:04 crc kubenswrapper[5008]: I0318 19:12:04.780493 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c585c0ddd33e6690d3090f7baae037cd74ce94f79f429c29cc368de61f7cb7bc" Mar 18 19:12:04 crc kubenswrapper[5008]: I0318 19:12:04.780513 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564352-cl8cg" Mar 18 19:12:05 crc kubenswrapper[5008]: I0318 19:12:05.232713 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564346-n55l7"] Mar 18 19:12:05 crc kubenswrapper[5008]: I0318 19:12:05.239735 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564346-n55l7"] Mar 18 19:12:06 crc kubenswrapper[5008]: I0318 19:12:06.212036 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a3f922-a281-4b02-be27-354a0a8c20df" path="/var/lib/kubelet/pods/23a3f922-a281-4b02-be27-354a0a8c20df/volumes" Mar 18 19:12:42 crc kubenswrapper[5008]: I0318 19:12:42.934394 5008 scope.go:117] "RemoveContainer" containerID="ebdf5ae40e66d37a3a0c516853396ee0e346b241b68adbe3dc32d4ede30cfbb9" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.418232 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z849b"] Mar 18 19:13:05 crc kubenswrapper[5008]: E0318 19:13:05.420652 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7adc401c-a2de-495c-8bd5-a6979e497ead" containerName="oc" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.420714 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="7adc401c-a2de-495c-8bd5-a6979e497ead" containerName="oc" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.421101 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="7adc401c-a2de-495c-8bd5-a6979e497ead" containerName="oc" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.422890 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.435118 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z849b"] Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.570180 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa554fdf-2869-422a-8966-1313367c5eb5-utilities\") pod \"certified-operators-z849b\" (UID: \"fa554fdf-2869-422a-8966-1313367c5eb5\") " pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.570288 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22fm4\" (UniqueName: \"kubernetes.io/projected/fa554fdf-2869-422a-8966-1313367c5eb5-kube-api-access-22fm4\") pod \"certified-operators-z849b\" (UID: \"fa554fdf-2869-422a-8966-1313367c5eb5\") " pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.570320 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa554fdf-2869-422a-8966-1313367c5eb5-catalog-content\") pod \"certified-operators-z849b\" (UID: \"fa554fdf-2869-422a-8966-1313367c5eb5\") " pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.621644 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b9vdm"] Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.623369 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.641408 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9vdm"] Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.675261 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa554fdf-2869-422a-8966-1313367c5eb5-utilities\") pod \"certified-operators-z849b\" (UID: \"fa554fdf-2869-422a-8966-1313367c5eb5\") " pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.675323 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22fm4\" (UniqueName: \"kubernetes.io/projected/fa554fdf-2869-422a-8966-1313367c5eb5-kube-api-access-22fm4\") pod \"certified-operators-z849b\" (UID: \"fa554fdf-2869-422a-8966-1313367c5eb5\") " pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.675345 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa554fdf-2869-422a-8966-1313367c5eb5-catalog-content\") pod \"certified-operators-z849b\" (UID: \"fa554fdf-2869-422a-8966-1313367c5eb5\") " pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.675768 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa554fdf-2869-422a-8966-1313367c5eb5-catalog-content\") pod \"certified-operators-z849b\" (UID: \"fa554fdf-2869-422a-8966-1313367c5eb5\") " pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.675967 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/fa554fdf-2869-422a-8966-1313367c5eb5-utilities\") pod \"certified-operators-z849b\" (UID: \"fa554fdf-2869-422a-8966-1313367c5eb5\") " pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.706586 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22fm4\" (UniqueName: \"kubernetes.io/projected/fa554fdf-2869-422a-8966-1313367c5eb5-kube-api-access-22fm4\") pod \"certified-operators-z849b\" (UID: \"fa554fdf-2869-422a-8966-1313367c5eb5\") " pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.776370 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfnmr\" (UniqueName: \"kubernetes.io/projected/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-kube-api-access-zfnmr\") pod \"redhat-operators-b9vdm\" (UID: \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\") " pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.776459 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-utilities\") pod \"redhat-operators-b9vdm\" (UID: \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\") " pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.776533 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-catalog-content\") pod \"redhat-operators-b9vdm\" (UID: \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\") " pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.794340 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.877622 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfnmr\" (UniqueName: \"kubernetes.io/projected/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-kube-api-access-zfnmr\") pod \"redhat-operators-b9vdm\" (UID: \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\") " pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.877943 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-utilities\") pod \"redhat-operators-b9vdm\" (UID: \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\") " pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.878528 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-utilities\") pod \"redhat-operators-b9vdm\" (UID: \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\") " pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.878627 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-catalog-content\") pod \"redhat-operators-b9vdm\" (UID: \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\") " pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.878911 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-catalog-content\") pod \"redhat-operators-b9vdm\" (UID: \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\") " 
pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.902196 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfnmr\" (UniqueName: \"kubernetes.io/projected/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-kube-api-access-zfnmr\") pod \"redhat-operators-b9vdm\" (UID: \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\") " pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:05 crc kubenswrapper[5008]: I0318 19:13:05.942908 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:06 crc kubenswrapper[5008]: I0318 19:13:06.269140 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z849b"] Mar 18 19:13:06 crc kubenswrapper[5008]: I0318 19:13:06.331731 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z849b" event={"ID":"fa554fdf-2869-422a-8966-1313367c5eb5","Type":"ContainerStarted","Data":"bec2284f93a50ea48e49835b4842c2c928681040aa36369d11a9c80d4351805e"} Mar 18 19:13:06 crc kubenswrapper[5008]: I0318 19:13:06.475246 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9vdm"] Mar 18 19:13:06 crc kubenswrapper[5008]: W0318 19:13:06.522791 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0ba9fa4_5575_436e_8c0d_200bd3127a3c.slice/crio-131433cf5bb3d64bf0a0a1b19183c364f652a46a39f6493e6222a2150c8a4417 WatchSource:0}: Error finding container 131433cf5bb3d64bf0a0a1b19183c364f652a46a39f6493e6222a2150c8a4417: Status 404 returned error can't find the container with id 131433cf5bb3d64bf0a0a1b19183c364f652a46a39f6493e6222a2150c8a4417 Mar 18 19:13:07 crc kubenswrapper[5008]: I0318 19:13:07.339135 5008 generic.go:334] "Generic (PLEG): container finished" 
podID="d0ba9fa4-5575-436e-8c0d-200bd3127a3c" containerID="12bb98f25a6b70381436c2cc205f1a3154228fb05f63ad49bbf61d0098abcf64" exitCode=0 Mar 18 19:13:07 crc kubenswrapper[5008]: I0318 19:13:07.339416 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9vdm" event={"ID":"d0ba9fa4-5575-436e-8c0d-200bd3127a3c","Type":"ContainerDied","Data":"12bb98f25a6b70381436c2cc205f1a3154228fb05f63ad49bbf61d0098abcf64"} Mar 18 19:13:07 crc kubenswrapper[5008]: I0318 19:13:07.339650 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9vdm" event={"ID":"d0ba9fa4-5575-436e-8c0d-200bd3127a3c","Type":"ContainerStarted","Data":"131433cf5bb3d64bf0a0a1b19183c364f652a46a39f6493e6222a2150c8a4417"} Mar 18 19:13:07 crc kubenswrapper[5008]: I0318 19:13:07.341636 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 19:13:07 crc kubenswrapper[5008]: I0318 19:13:07.341954 5008 generic.go:334] "Generic (PLEG): container finished" podID="fa554fdf-2869-422a-8966-1313367c5eb5" containerID="7c6815ecf016699e0e54ba9e48f47d21d0c55d9833991d9ae3e33cac444791a7" exitCode=0 Mar 18 19:13:07 crc kubenswrapper[5008]: I0318 19:13:07.341984 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z849b" event={"ID":"fa554fdf-2869-422a-8966-1313367c5eb5","Type":"ContainerDied","Data":"7c6815ecf016699e0e54ba9e48f47d21d0c55d9833991d9ae3e33cac444791a7"} Mar 18 19:13:07 crc kubenswrapper[5008]: I0318 19:13:07.815833 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sjwkb"] Mar 18 19:13:07 crc kubenswrapper[5008]: I0318 19:13:07.818083 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:07 crc kubenswrapper[5008]: I0318 19:13:07.831028 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sjwkb"] Mar 18 19:13:07 crc kubenswrapper[5008]: I0318 19:13:07.919006 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/863715e6-ed8e-4318-b9e4-45024a33f82b-catalog-content\") pod \"community-operators-sjwkb\" (UID: \"863715e6-ed8e-4318-b9e4-45024a33f82b\") " pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:07 crc kubenswrapper[5008]: I0318 19:13:07.919060 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/863715e6-ed8e-4318-b9e4-45024a33f82b-utilities\") pod \"community-operators-sjwkb\" (UID: \"863715e6-ed8e-4318-b9e4-45024a33f82b\") " pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:07 crc kubenswrapper[5008]: I0318 19:13:07.919240 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d99vw\" (UniqueName: \"kubernetes.io/projected/863715e6-ed8e-4318-b9e4-45024a33f82b-kube-api-access-d99vw\") pod \"community-operators-sjwkb\" (UID: \"863715e6-ed8e-4318-b9e4-45024a33f82b\") " pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:08 crc kubenswrapper[5008]: I0318 19:13:08.021085 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d99vw\" (UniqueName: \"kubernetes.io/projected/863715e6-ed8e-4318-b9e4-45024a33f82b-kube-api-access-d99vw\") pod \"community-operators-sjwkb\" (UID: \"863715e6-ed8e-4318-b9e4-45024a33f82b\") " pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:08 crc kubenswrapper[5008]: I0318 19:13:08.021216 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/863715e6-ed8e-4318-b9e4-45024a33f82b-catalog-content\") pod \"community-operators-sjwkb\" (UID: \"863715e6-ed8e-4318-b9e4-45024a33f82b\") " pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:08 crc kubenswrapper[5008]: I0318 19:13:08.021250 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/863715e6-ed8e-4318-b9e4-45024a33f82b-utilities\") pod \"community-operators-sjwkb\" (UID: \"863715e6-ed8e-4318-b9e4-45024a33f82b\") " pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:08 crc kubenswrapper[5008]: I0318 19:13:08.021876 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/863715e6-ed8e-4318-b9e4-45024a33f82b-utilities\") pod \"community-operators-sjwkb\" (UID: \"863715e6-ed8e-4318-b9e4-45024a33f82b\") " pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:08 crc kubenswrapper[5008]: I0318 19:13:08.022190 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/863715e6-ed8e-4318-b9e4-45024a33f82b-catalog-content\") pod \"community-operators-sjwkb\" (UID: \"863715e6-ed8e-4318-b9e4-45024a33f82b\") " pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:08 crc kubenswrapper[5008]: I0318 19:13:08.040026 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d99vw\" (UniqueName: \"kubernetes.io/projected/863715e6-ed8e-4318-b9e4-45024a33f82b-kube-api-access-d99vw\") pod \"community-operators-sjwkb\" (UID: \"863715e6-ed8e-4318-b9e4-45024a33f82b\") " pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:08 crc kubenswrapper[5008]: I0318 19:13:08.184951 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:08 crc kubenswrapper[5008]: I0318 19:13:08.364620 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z849b" event={"ID":"fa554fdf-2869-422a-8966-1313367c5eb5","Type":"ContainerStarted","Data":"e4cadbff3280920d1240b26237130caa2f44bf3f611674e08ebc4906153eabab"} Mar 18 19:13:08 crc kubenswrapper[5008]: I0318 19:13:08.507821 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sjwkb"] Mar 18 19:13:08 crc kubenswrapper[5008]: W0318 19:13:08.513214 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod863715e6_ed8e_4318_b9e4_45024a33f82b.slice/crio-3e18686a161b1652e3c66c466f9fcd559c32801000133dc563dfbba8b6d4185c WatchSource:0}: Error finding container 3e18686a161b1652e3c66c466f9fcd559c32801000133dc563dfbba8b6d4185c: Status 404 returned error can't find the container with id 3e18686a161b1652e3c66c466f9fcd559c32801000133dc563dfbba8b6d4185c Mar 18 19:13:09 crc kubenswrapper[5008]: I0318 19:13:09.378494 5008 generic.go:334] "Generic (PLEG): container finished" podID="fa554fdf-2869-422a-8966-1313367c5eb5" containerID="e4cadbff3280920d1240b26237130caa2f44bf3f611674e08ebc4906153eabab" exitCode=0 Mar 18 19:13:09 crc kubenswrapper[5008]: I0318 19:13:09.378632 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z849b" event={"ID":"fa554fdf-2869-422a-8966-1313367c5eb5","Type":"ContainerDied","Data":"e4cadbff3280920d1240b26237130caa2f44bf3f611674e08ebc4906153eabab"} Mar 18 19:13:09 crc kubenswrapper[5008]: I0318 19:13:09.383901 5008 generic.go:334] "Generic (PLEG): container finished" podID="863715e6-ed8e-4318-b9e4-45024a33f82b" containerID="685995a2c62a50996ee0bf044227ba2185dc6ee0591a91a026da2cbd36b0c83c" exitCode=0 Mar 18 19:13:09 crc kubenswrapper[5008]: I0318 
19:13:09.384192 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjwkb" event={"ID":"863715e6-ed8e-4318-b9e4-45024a33f82b","Type":"ContainerDied","Data":"685995a2c62a50996ee0bf044227ba2185dc6ee0591a91a026da2cbd36b0c83c"} Mar 18 19:13:09 crc kubenswrapper[5008]: I0318 19:13:09.384279 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjwkb" event={"ID":"863715e6-ed8e-4318-b9e4-45024a33f82b","Type":"ContainerStarted","Data":"3e18686a161b1652e3c66c466f9fcd559c32801000133dc563dfbba8b6d4185c"} Mar 18 19:13:09 crc kubenswrapper[5008]: I0318 19:13:09.395423 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9vdm" event={"ID":"d0ba9fa4-5575-436e-8c0d-200bd3127a3c","Type":"ContainerStarted","Data":"ec630a106d2110441adf2f53e5ebcdf59787c00b6ee3ea3d865e3b2fcf5d702a"} Mar 18 19:13:10 crc kubenswrapper[5008]: I0318 19:13:10.408164 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z849b" event={"ID":"fa554fdf-2869-422a-8966-1313367c5eb5","Type":"ContainerStarted","Data":"d7c4a6cdbb762106dded7d7e40218dcfc478c6844593afc30f9a0e75519fd76b"} Mar 18 19:13:10 crc kubenswrapper[5008]: I0318 19:13:10.413501 5008 generic.go:334] "Generic (PLEG): container finished" podID="d0ba9fa4-5575-436e-8c0d-200bd3127a3c" containerID="ec630a106d2110441adf2f53e5ebcdf59787c00b6ee3ea3d865e3b2fcf5d702a" exitCode=0 Mar 18 19:13:10 crc kubenswrapper[5008]: I0318 19:13:10.413569 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9vdm" event={"ID":"d0ba9fa4-5575-436e-8c0d-200bd3127a3c","Type":"ContainerDied","Data":"ec630a106d2110441adf2f53e5ebcdf59787c00b6ee3ea3d865e3b2fcf5d702a"} Mar 18 19:13:10 crc kubenswrapper[5008]: I0318 19:13:10.437378 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-z849b" podStartSLOduration=2.990152146 podStartE2EDuration="5.437340928s" podCreationTimestamp="2026-03-18 19:13:05 +0000 UTC" firstStartedPulling="2026-03-18 19:13:07.343442884 +0000 UTC m=+4243.862915973" lastFinishedPulling="2026-03-18 19:13:09.790631666 +0000 UTC m=+4246.310104755" observedRunningTime="2026-03-18 19:13:10.426919715 +0000 UTC m=+4246.946392804" watchObservedRunningTime="2026-03-18 19:13:10.437340928 +0000 UTC m=+4246.956814047" Mar 18 19:13:11 crc kubenswrapper[5008]: I0318 19:13:11.420176 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9vdm" event={"ID":"d0ba9fa4-5575-436e-8c0d-200bd3127a3c","Type":"ContainerStarted","Data":"8c97aa513d980bad1574fdc9fac0bfd7f72d3243b2f54c02c9f06fa017091b18"} Mar 18 19:13:11 crc kubenswrapper[5008]: I0318 19:13:11.422153 5008 generic.go:334] "Generic (PLEG): container finished" podID="863715e6-ed8e-4318-b9e4-45024a33f82b" containerID="cca251f8b6da7f6e3de74d625f968ce4b98e4dcba88e518dd8d63c8b3d923d3e" exitCode=0 Mar 18 19:13:11 crc kubenswrapper[5008]: I0318 19:13:11.422237 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjwkb" event={"ID":"863715e6-ed8e-4318-b9e4-45024a33f82b","Type":"ContainerDied","Data":"cca251f8b6da7f6e3de74d625f968ce4b98e4dcba88e518dd8d63c8b3d923d3e"} Mar 18 19:13:11 crc kubenswrapper[5008]: I0318 19:13:11.447127 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b9vdm" podStartSLOduration=2.976119029 podStartE2EDuration="6.447107625s" podCreationTimestamp="2026-03-18 19:13:05 +0000 UTC" firstStartedPulling="2026-03-18 19:13:07.34138143 +0000 UTC m=+4243.860854499" lastFinishedPulling="2026-03-18 19:13:10.812370006 +0000 UTC m=+4247.331843095" observedRunningTime="2026-03-18 19:13:11.44084038 +0000 UTC m=+4247.960313459" watchObservedRunningTime="2026-03-18 19:13:11.447107625 
+0000 UTC m=+4247.966580704" Mar 18 19:13:12 crc kubenswrapper[5008]: I0318 19:13:12.431691 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjwkb" event={"ID":"863715e6-ed8e-4318-b9e4-45024a33f82b","Type":"ContainerStarted","Data":"c1b50558808c855ae38edb72313032e0e82463d063212278ed8045802b52fa9d"} Mar 18 19:13:12 crc kubenswrapper[5008]: I0318 19:13:12.457856 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sjwkb" podStartSLOduration=3.019067675 podStartE2EDuration="5.457831456s" podCreationTimestamp="2026-03-18 19:13:07 +0000 UTC" firstStartedPulling="2026-03-18 19:13:09.386113104 +0000 UTC m=+4245.905586223" lastFinishedPulling="2026-03-18 19:13:11.824876925 +0000 UTC m=+4248.344350004" observedRunningTime="2026-03-18 19:13:12.450247707 +0000 UTC m=+4248.969720806" watchObservedRunningTime="2026-03-18 19:13:12.457831456 +0000 UTC m=+4248.977304545" Mar 18 19:13:15 crc kubenswrapper[5008]: I0318 19:13:15.795305 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:15 crc kubenswrapper[5008]: I0318 19:13:15.795676 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:15 crc kubenswrapper[5008]: I0318 19:13:15.836419 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:15 crc kubenswrapper[5008]: I0318 19:13:15.944598 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:15 crc kubenswrapper[5008]: I0318 19:13:15.944651 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:16 crc kubenswrapper[5008]: I0318 19:13:16.532427 5008 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:16 crc kubenswrapper[5008]: I0318 19:13:16.998412 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b9vdm" podUID="d0ba9fa4-5575-436e-8c0d-200bd3127a3c" containerName="registry-server" probeResult="failure" output=< Mar 18 19:13:16 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s Mar 18 19:13:16 crc kubenswrapper[5008]: > Mar 18 19:13:17 crc kubenswrapper[5008]: I0318 19:13:17.411147 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z849b"] Mar 18 19:13:18 crc kubenswrapper[5008]: I0318 19:13:18.185579 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:18 crc kubenswrapper[5008]: I0318 19:13:18.185692 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:18 crc kubenswrapper[5008]: I0318 19:13:18.231052 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:18 crc kubenswrapper[5008]: I0318 19:13:18.482968 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z849b" podUID="fa554fdf-2869-422a-8966-1313367c5eb5" containerName="registry-server" containerID="cri-o://d7c4a6cdbb762106dded7d7e40218dcfc478c6844593afc30f9a0e75519fd76b" gracePeriod=2 Mar 18 19:13:18 crc kubenswrapper[5008]: I0318 19:13:18.528078 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:18 crc kubenswrapper[5008]: I0318 19:13:18.890910 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:18 crc kubenswrapper[5008]: I0318 19:13:18.987334 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa554fdf-2869-422a-8966-1313367c5eb5-catalog-content\") pod \"fa554fdf-2869-422a-8966-1313367c5eb5\" (UID: \"fa554fdf-2869-422a-8966-1313367c5eb5\") " Mar 18 19:13:18 crc kubenswrapper[5008]: I0318 19:13:18.987542 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22fm4\" (UniqueName: \"kubernetes.io/projected/fa554fdf-2869-422a-8966-1313367c5eb5-kube-api-access-22fm4\") pod \"fa554fdf-2869-422a-8966-1313367c5eb5\" (UID: \"fa554fdf-2869-422a-8966-1313367c5eb5\") " Mar 18 19:13:18 crc kubenswrapper[5008]: I0318 19:13:18.987651 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa554fdf-2869-422a-8966-1313367c5eb5-utilities\") pod \"fa554fdf-2869-422a-8966-1313367c5eb5\" (UID: \"fa554fdf-2869-422a-8966-1313367c5eb5\") " Mar 18 19:13:18 crc kubenswrapper[5008]: I0318 19:13:18.988413 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa554fdf-2869-422a-8966-1313367c5eb5-utilities" (OuterVolumeSpecName: "utilities") pod "fa554fdf-2869-422a-8966-1313367c5eb5" (UID: "fa554fdf-2869-422a-8966-1313367c5eb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:13:18 crc kubenswrapper[5008]: I0318 19:13:18.993421 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa554fdf-2869-422a-8966-1313367c5eb5-kube-api-access-22fm4" (OuterVolumeSpecName: "kube-api-access-22fm4") pod "fa554fdf-2869-422a-8966-1313367c5eb5" (UID: "fa554fdf-2869-422a-8966-1313367c5eb5"). InnerVolumeSpecName "kube-api-access-22fm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.090044 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22fm4\" (UniqueName: \"kubernetes.io/projected/fa554fdf-2869-422a-8966-1313367c5eb5-kube-api-access-22fm4\") on node \"crc\" DevicePath \"\"" Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.090085 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa554fdf-2869-422a-8966-1313367c5eb5-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.492936 5008 generic.go:334] "Generic (PLEG): container finished" podID="fa554fdf-2869-422a-8966-1313367c5eb5" containerID="d7c4a6cdbb762106dded7d7e40218dcfc478c6844593afc30f9a0e75519fd76b" exitCode=0 Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.492989 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z849b" Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.493046 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z849b" event={"ID":"fa554fdf-2869-422a-8966-1313367c5eb5","Type":"ContainerDied","Data":"d7c4a6cdbb762106dded7d7e40218dcfc478c6844593afc30f9a0e75519fd76b"} Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.493080 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z849b" event={"ID":"fa554fdf-2869-422a-8966-1313367c5eb5","Type":"ContainerDied","Data":"bec2284f93a50ea48e49835b4842c2c928681040aa36369d11a9c80d4351805e"} Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.493101 5008 scope.go:117] "RemoveContainer" containerID="d7c4a6cdbb762106dded7d7e40218dcfc478c6844593afc30f9a0e75519fd76b" Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.516653 5008 scope.go:117] "RemoveContainer" 
containerID="e4cadbff3280920d1240b26237130caa2f44bf3f611674e08ebc4906153eabab" Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.544137 5008 scope.go:117] "RemoveContainer" containerID="7c6815ecf016699e0e54ba9e48f47d21d0c55d9833991d9ae3e33cac444791a7" Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.572835 5008 scope.go:117] "RemoveContainer" containerID="d7c4a6cdbb762106dded7d7e40218dcfc478c6844593afc30f9a0e75519fd76b" Mar 18 19:13:19 crc kubenswrapper[5008]: E0318 19:13:19.573482 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c4a6cdbb762106dded7d7e40218dcfc478c6844593afc30f9a0e75519fd76b\": container with ID starting with d7c4a6cdbb762106dded7d7e40218dcfc478c6844593afc30f9a0e75519fd76b not found: ID does not exist" containerID="d7c4a6cdbb762106dded7d7e40218dcfc478c6844593afc30f9a0e75519fd76b" Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.573526 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c4a6cdbb762106dded7d7e40218dcfc478c6844593afc30f9a0e75519fd76b"} err="failed to get container status \"d7c4a6cdbb762106dded7d7e40218dcfc478c6844593afc30f9a0e75519fd76b\": rpc error: code = NotFound desc = could not find container \"d7c4a6cdbb762106dded7d7e40218dcfc478c6844593afc30f9a0e75519fd76b\": container with ID starting with d7c4a6cdbb762106dded7d7e40218dcfc478c6844593afc30f9a0e75519fd76b not found: ID does not exist" Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.573547 5008 scope.go:117] "RemoveContainer" containerID="e4cadbff3280920d1240b26237130caa2f44bf3f611674e08ebc4906153eabab" Mar 18 19:13:19 crc kubenswrapper[5008]: E0318 19:13:19.574386 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4cadbff3280920d1240b26237130caa2f44bf3f611674e08ebc4906153eabab\": container with ID starting with 
e4cadbff3280920d1240b26237130caa2f44bf3f611674e08ebc4906153eabab not found: ID does not exist" containerID="e4cadbff3280920d1240b26237130caa2f44bf3f611674e08ebc4906153eabab" Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.574452 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4cadbff3280920d1240b26237130caa2f44bf3f611674e08ebc4906153eabab"} err="failed to get container status \"e4cadbff3280920d1240b26237130caa2f44bf3f611674e08ebc4906153eabab\": rpc error: code = NotFound desc = could not find container \"e4cadbff3280920d1240b26237130caa2f44bf3f611674e08ebc4906153eabab\": container with ID starting with e4cadbff3280920d1240b26237130caa2f44bf3f611674e08ebc4906153eabab not found: ID does not exist" Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.574499 5008 scope.go:117] "RemoveContainer" containerID="7c6815ecf016699e0e54ba9e48f47d21d0c55d9833991d9ae3e33cac444791a7" Mar 18 19:13:19 crc kubenswrapper[5008]: E0318 19:13:19.575193 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c6815ecf016699e0e54ba9e48f47d21d0c55d9833991d9ae3e33cac444791a7\": container with ID starting with 7c6815ecf016699e0e54ba9e48f47d21d0c55d9833991d9ae3e33cac444791a7 not found: ID does not exist" containerID="7c6815ecf016699e0e54ba9e48f47d21d0c55d9833991d9ae3e33cac444791a7" Mar 18 19:13:19 crc kubenswrapper[5008]: I0318 19:13:19.575237 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c6815ecf016699e0e54ba9e48f47d21d0c55d9833991d9ae3e33cac444791a7"} err="failed to get container status \"7c6815ecf016699e0e54ba9e48f47d21d0c55d9833991d9ae3e33cac444791a7\": rpc error: code = NotFound desc = could not find container \"7c6815ecf016699e0e54ba9e48f47d21d0c55d9833991d9ae3e33cac444791a7\": container with ID starting with 7c6815ecf016699e0e54ba9e48f47d21d0c55d9833991d9ae3e33cac444791a7 not found: ID does not 
exist" Mar 18 19:13:20 crc kubenswrapper[5008]: I0318 19:13:20.561721 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa554fdf-2869-422a-8966-1313367c5eb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa554fdf-2869-422a-8966-1313367c5eb5" (UID: "fa554fdf-2869-422a-8966-1313367c5eb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:13:20 crc kubenswrapper[5008]: I0318 19:13:20.606112 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sjwkb"] Mar 18 19:13:20 crc kubenswrapper[5008]: I0318 19:13:20.606321 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sjwkb" podUID="863715e6-ed8e-4318-b9e4-45024a33f82b" containerName="registry-server" containerID="cri-o://c1b50558808c855ae38edb72313032e0e82463d063212278ed8045802b52fa9d" gracePeriod=2 Mar 18 19:13:20 crc kubenswrapper[5008]: I0318 19:13:20.611019 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa554fdf-2869-422a-8966-1313367c5eb5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:13:20 crc kubenswrapper[5008]: I0318 19:13:20.733318 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z849b"] Mar 18 19:13:20 crc kubenswrapper[5008]: I0318 19:13:20.742338 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z849b"] Mar 18 19:13:21 crc kubenswrapper[5008]: I0318 19:13:21.514844 5008 generic.go:334] "Generic (PLEG): container finished" podID="863715e6-ed8e-4318-b9e4-45024a33f82b" containerID="c1b50558808c855ae38edb72313032e0e82463d063212278ed8045802b52fa9d" exitCode=0 Mar 18 19:13:21 crc kubenswrapper[5008]: I0318 19:13:21.514893 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-sjwkb" event={"ID":"863715e6-ed8e-4318-b9e4-45024a33f82b","Type":"ContainerDied","Data":"c1b50558808c855ae38edb72313032e0e82463d063212278ed8045802b52fa9d"} Mar 18 19:13:21 crc kubenswrapper[5008]: I0318 19:13:21.604209 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:21 crc kubenswrapper[5008]: I0318 19:13:21.724951 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d99vw\" (UniqueName: \"kubernetes.io/projected/863715e6-ed8e-4318-b9e4-45024a33f82b-kube-api-access-d99vw\") pod \"863715e6-ed8e-4318-b9e4-45024a33f82b\" (UID: \"863715e6-ed8e-4318-b9e4-45024a33f82b\") " Mar 18 19:13:21 crc kubenswrapper[5008]: I0318 19:13:21.725000 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/863715e6-ed8e-4318-b9e4-45024a33f82b-catalog-content\") pod \"863715e6-ed8e-4318-b9e4-45024a33f82b\" (UID: \"863715e6-ed8e-4318-b9e4-45024a33f82b\") " Mar 18 19:13:21 crc kubenswrapper[5008]: I0318 19:13:21.725031 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/863715e6-ed8e-4318-b9e4-45024a33f82b-utilities\") pod \"863715e6-ed8e-4318-b9e4-45024a33f82b\" (UID: \"863715e6-ed8e-4318-b9e4-45024a33f82b\") " Mar 18 19:13:21 crc kubenswrapper[5008]: I0318 19:13:21.726166 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/863715e6-ed8e-4318-b9e4-45024a33f82b-utilities" (OuterVolumeSpecName: "utilities") pod "863715e6-ed8e-4318-b9e4-45024a33f82b" (UID: "863715e6-ed8e-4318-b9e4-45024a33f82b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:13:21 crc kubenswrapper[5008]: I0318 19:13:21.732917 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/863715e6-ed8e-4318-b9e4-45024a33f82b-kube-api-access-d99vw" (OuterVolumeSpecName: "kube-api-access-d99vw") pod "863715e6-ed8e-4318-b9e4-45024a33f82b" (UID: "863715e6-ed8e-4318-b9e4-45024a33f82b"). InnerVolumeSpecName "kube-api-access-d99vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:13:21 crc kubenswrapper[5008]: I0318 19:13:21.785747 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/863715e6-ed8e-4318-b9e4-45024a33f82b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "863715e6-ed8e-4318-b9e4-45024a33f82b" (UID: "863715e6-ed8e-4318-b9e4-45024a33f82b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:13:21 crc kubenswrapper[5008]: I0318 19:13:21.827164 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d99vw\" (UniqueName: \"kubernetes.io/projected/863715e6-ed8e-4318-b9e4-45024a33f82b-kube-api-access-d99vw\") on node \"crc\" DevicePath \"\"" Mar 18 19:13:21 crc kubenswrapper[5008]: I0318 19:13:21.827213 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/863715e6-ed8e-4318-b9e4-45024a33f82b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:13:21 crc kubenswrapper[5008]: I0318 19:13:21.827227 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/863715e6-ed8e-4318-b9e4-45024a33f82b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:13:22 crc kubenswrapper[5008]: I0318 19:13:22.214711 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa554fdf-2869-422a-8966-1313367c5eb5" 
path="/var/lib/kubelet/pods/fa554fdf-2869-422a-8966-1313367c5eb5/volumes" Mar 18 19:13:22 crc kubenswrapper[5008]: I0318 19:13:22.528448 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjwkb" event={"ID":"863715e6-ed8e-4318-b9e4-45024a33f82b","Type":"ContainerDied","Data":"3e18686a161b1652e3c66c466f9fcd559c32801000133dc563dfbba8b6d4185c"} Mar 18 19:13:22 crc kubenswrapper[5008]: I0318 19:13:22.528678 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sjwkb" Mar 18 19:13:22 crc kubenswrapper[5008]: I0318 19:13:22.529218 5008 scope.go:117] "RemoveContainer" containerID="c1b50558808c855ae38edb72313032e0e82463d063212278ed8045802b52fa9d" Mar 18 19:13:22 crc kubenswrapper[5008]: I0318 19:13:22.557292 5008 scope.go:117] "RemoveContainer" containerID="cca251f8b6da7f6e3de74d625f968ce4b98e4dcba88e518dd8d63c8b3d923d3e" Mar 18 19:13:22 crc kubenswrapper[5008]: I0318 19:13:22.560018 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sjwkb"] Mar 18 19:13:22 crc kubenswrapper[5008]: I0318 19:13:22.570864 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sjwkb"] Mar 18 19:13:22 crc kubenswrapper[5008]: I0318 19:13:22.589006 5008 scope.go:117] "RemoveContainer" containerID="685995a2c62a50996ee0bf044227ba2185dc6ee0591a91a026da2cbd36b0c83c" Mar 18 19:13:24 crc kubenswrapper[5008]: I0318 19:13:24.213909 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="863715e6-ed8e-4318-b9e4-45024a33f82b" path="/var/lib/kubelet/pods/863715e6-ed8e-4318-b9e4-45024a33f82b/volumes" Mar 18 19:13:24 crc kubenswrapper[5008]: I0318 19:13:24.460859 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:13:24 crc kubenswrapper[5008]: I0318 19:13:24.460946 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:13:26 crc kubenswrapper[5008]: I0318 19:13:26.027378 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:26 crc kubenswrapper[5008]: I0318 19:13:26.109521 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:26 crc kubenswrapper[5008]: I0318 19:13:26.287282 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9vdm"] Mar 18 19:13:27 crc kubenswrapper[5008]: I0318 19:13:27.579902 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b9vdm" podUID="d0ba9fa4-5575-436e-8c0d-200bd3127a3c" containerName="registry-server" containerID="cri-o://8c97aa513d980bad1574fdc9fac0bfd7f72d3243b2f54c02c9f06fa017091b18" gracePeriod=2 Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.103950 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.236022 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-utilities\") pod \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\" (UID: \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\") " Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.236108 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-catalog-content\") pod \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\" (UID: \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\") " Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.236260 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfnmr\" (UniqueName: \"kubernetes.io/projected/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-kube-api-access-zfnmr\") pod \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\" (UID: \"d0ba9fa4-5575-436e-8c0d-200bd3127a3c\") " Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.237277 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-utilities" (OuterVolumeSpecName: "utilities") pod "d0ba9fa4-5575-436e-8c0d-200bd3127a3c" (UID: "d0ba9fa4-5575-436e-8c0d-200bd3127a3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.244655 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-kube-api-access-zfnmr" (OuterVolumeSpecName: "kube-api-access-zfnmr") pod "d0ba9fa4-5575-436e-8c0d-200bd3127a3c" (UID: "d0ba9fa4-5575-436e-8c0d-200bd3127a3c"). InnerVolumeSpecName "kube-api-access-zfnmr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.339255 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfnmr\" (UniqueName: \"kubernetes.io/projected/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-kube-api-access-zfnmr\") on node \"crc\" DevicePath \"\"" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.339341 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.394424 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0ba9fa4-5575-436e-8c0d-200bd3127a3c" (UID: "d0ba9fa4-5575-436e-8c0d-200bd3127a3c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.441170 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0ba9fa4-5575-436e-8c0d-200bd3127a3c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.590447 5008 generic.go:334] "Generic (PLEG): container finished" podID="d0ba9fa4-5575-436e-8c0d-200bd3127a3c" containerID="8c97aa513d980bad1574fdc9fac0bfd7f72d3243b2f54c02c9f06fa017091b18" exitCode=0 Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.590528 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9vdm" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.590527 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9vdm" event={"ID":"d0ba9fa4-5575-436e-8c0d-200bd3127a3c","Type":"ContainerDied","Data":"8c97aa513d980bad1574fdc9fac0bfd7f72d3243b2f54c02c9f06fa017091b18"} Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.592048 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9vdm" event={"ID":"d0ba9fa4-5575-436e-8c0d-200bd3127a3c","Type":"ContainerDied","Data":"131433cf5bb3d64bf0a0a1b19183c364f652a46a39f6493e6222a2150c8a4417"} Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.592075 5008 scope.go:117] "RemoveContainer" containerID="8c97aa513d980bad1574fdc9fac0bfd7f72d3243b2f54c02c9f06fa017091b18" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.611205 5008 scope.go:117] "RemoveContainer" containerID="ec630a106d2110441adf2f53e5ebcdf59787c00b6ee3ea3d865e3b2fcf5d702a" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.628481 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9vdm"] Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.634539 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b9vdm"] Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.650744 5008 scope.go:117] "RemoveContainer" containerID="12bb98f25a6b70381436c2cc205f1a3154228fb05f63ad49bbf61d0098abcf64" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.682153 5008 scope.go:117] "RemoveContainer" containerID="8c97aa513d980bad1574fdc9fac0bfd7f72d3243b2f54c02c9f06fa017091b18" Mar 18 19:13:28 crc kubenswrapper[5008]: E0318 19:13:28.682644 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8c97aa513d980bad1574fdc9fac0bfd7f72d3243b2f54c02c9f06fa017091b18\": container with ID starting with 8c97aa513d980bad1574fdc9fac0bfd7f72d3243b2f54c02c9f06fa017091b18 not found: ID does not exist" containerID="8c97aa513d980bad1574fdc9fac0bfd7f72d3243b2f54c02c9f06fa017091b18" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.682687 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c97aa513d980bad1574fdc9fac0bfd7f72d3243b2f54c02c9f06fa017091b18"} err="failed to get container status \"8c97aa513d980bad1574fdc9fac0bfd7f72d3243b2f54c02c9f06fa017091b18\": rpc error: code = NotFound desc = could not find container \"8c97aa513d980bad1574fdc9fac0bfd7f72d3243b2f54c02c9f06fa017091b18\": container with ID starting with 8c97aa513d980bad1574fdc9fac0bfd7f72d3243b2f54c02c9f06fa017091b18 not found: ID does not exist" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.682715 5008 scope.go:117] "RemoveContainer" containerID="ec630a106d2110441adf2f53e5ebcdf59787c00b6ee3ea3d865e3b2fcf5d702a" Mar 18 19:13:28 crc kubenswrapper[5008]: E0318 19:13:28.683155 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec630a106d2110441adf2f53e5ebcdf59787c00b6ee3ea3d865e3b2fcf5d702a\": container with ID starting with ec630a106d2110441adf2f53e5ebcdf59787c00b6ee3ea3d865e3b2fcf5d702a not found: ID does not exist" containerID="ec630a106d2110441adf2f53e5ebcdf59787c00b6ee3ea3d865e3b2fcf5d702a" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.683198 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec630a106d2110441adf2f53e5ebcdf59787c00b6ee3ea3d865e3b2fcf5d702a"} err="failed to get container status \"ec630a106d2110441adf2f53e5ebcdf59787c00b6ee3ea3d865e3b2fcf5d702a\": rpc error: code = NotFound desc = could not find container \"ec630a106d2110441adf2f53e5ebcdf59787c00b6ee3ea3d865e3b2fcf5d702a\": container with ID 
starting with ec630a106d2110441adf2f53e5ebcdf59787c00b6ee3ea3d865e3b2fcf5d702a not found: ID does not exist" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.683260 5008 scope.go:117] "RemoveContainer" containerID="12bb98f25a6b70381436c2cc205f1a3154228fb05f63ad49bbf61d0098abcf64" Mar 18 19:13:28 crc kubenswrapper[5008]: E0318 19:13:28.683596 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12bb98f25a6b70381436c2cc205f1a3154228fb05f63ad49bbf61d0098abcf64\": container with ID starting with 12bb98f25a6b70381436c2cc205f1a3154228fb05f63ad49bbf61d0098abcf64 not found: ID does not exist" containerID="12bb98f25a6b70381436c2cc205f1a3154228fb05f63ad49bbf61d0098abcf64" Mar 18 19:13:28 crc kubenswrapper[5008]: I0318 19:13:28.683658 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12bb98f25a6b70381436c2cc205f1a3154228fb05f63ad49bbf61d0098abcf64"} err="failed to get container status \"12bb98f25a6b70381436c2cc205f1a3154228fb05f63ad49bbf61d0098abcf64\": rpc error: code = NotFound desc = could not find container \"12bb98f25a6b70381436c2cc205f1a3154228fb05f63ad49bbf61d0098abcf64\": container with ID starting with 12bb98f25a6b70381436c2cc205f1a3154228fb05f63ad49bbf61d0098abcf64 not found: ID does not exist" Mar 18 19:13:30 crc kubenswrapper[5008]: I0318 19:13:30.210839 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0ba9fa4-5575-436e-8c0d-200bd3127a3c" path="/var/lib/kubelet/pods/d0ba9fa4-5575-436e-8c0d-200bd3127a3c/volumes" Mar 18 19:13:54 crc kubenswrapper[5008]: I0318 19:13:54.459959 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:13:54 crc kubenswrapper[5008]: I0318 
19:13:54.460804 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.166925 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564354-rwscz"] Mar 18 19:14:00 crc kubenswrapper[5008]: E0318 19:14:00.167790 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0ba9fa4-5575-436e-8c0d-200bd3127a3c" containerName="extract-content" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.167805 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ba9fa4-5575-436e-8c0d-200bd3127a3c" containerName="extract-content" Mar 18 19:14:00 crc kubenswrapper[5008]: E0318 19:14:00.167829 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863715e6-ed8e-4318-b9e4-45024a33f82b" containerName="extract-content" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.167837 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="863715e6-ed8e-4318-b9e4-45024a33f82b" containerName="extract-content" Mar 18 19:14:00 crc kubenswrapper[5008]: E0318 19:14:00.167850 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863715e6-ed8e-4318-b9e4-45024a33f82b" containerName="registry-server" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.167858 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="863715e6-ed8e-4318-b9e4-45024a33f82b" containerName="registry-server" Mar 18 19:14:00 crc kubenswrapper[5008]: E0318 19:14:00.167872 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863715e6-ed8e-4318-b9e4-45024a33f82b" containerName="extract-utilities" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.167880 5008 
state_mem.go:107] "Deleted CPUSet assignment" podUID="863715e6-ed8e-4318-b9e4-45024a33f82b" containerName="extract-utilities" Mar 18 19:14:00 crc kubenswrapper[5008]: E0318 19:14:00.167893 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0ba9fa4-5575-436e-8c0d-200bd3127a3c" containerName="registry-server" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.167900 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ba9fa4-5575-436e-8c0d-200bd3127a3c" containerName="registry-server" Mar 18 19:14:00 crc kubenswrapper[5008]: E0318 19:14:00.167913 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0ba9fa4-5575-436e-8c0d-200bd3127a3c" containerName="extract-utilities" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.167920 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ba9fa4-5575-436e-8c0d-200bd3127a3c" containerName="extract-utilities" Mar 18 19:14:00 crc kubenswrapper[5008]: E0318 19:14:00.167940 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa554fdf-2869-422a-8966-1313367c5eb5" containerName="extract-content" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.167947 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa554fdf-2869-422a-8966-1313367c5eb5" containerName="extract-content" Mar 18 19:14:00 crc kubenswrapper[5008]: E0318 19:14:00.167957 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa554fdf-2869-422a-8966-1313367c5eb5" containerName="registry-server" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.167963 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa554fdf-2869-422a-8966-1313367c5eb5" containerName="registry-server" Mar 18 19:14:00 crc kubenswrapper[5008]: E0318 19:14:00.167976 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa554fdf-2869-422a-8966-1313367c5eb5" containerName="extract-utilities" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.167984 5008 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fa554fdf-2869-422a-8966-1313367c5eb5" containerName="extract-utilities" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.168136 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="863715e6-ed8e-4318-b9e4-45024a33f82b" containerName="registry-server" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.168156 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0ba9fa4-5575-436e-8c0d-200bd3127a3c" containerName="registry-server" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.168172 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa554fdf-2869-422a-8966-1313367c5eb5" containerName="registry-server" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.168756 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564354-rwscz" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.171688 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.172008 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.172251 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.191584 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564354-rwscz"] Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.263742 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnqps\" (UniqueName: \"kubernetes.io/projected/6f799ff0-38de-4d07-b0f2-92a945600a96-kube-api-access-hnqps\") pod \"auto-csr-approver-29564354-rwscz\" (UID: 
\"6f799ff0-38de-4d07-b0f2-92a945600a96\") " pod="openshift-infra/auto-csr-approver-29564354-rwscz" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.365833 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnqps\" (UniqueName: \"kubernetes.io/projected/6f799ff0-38de-4d07-b0f2-92a945600a96-kube-api-access-hnqps\") pod \"auto-csr-approver-29564354-rwscz\" (UID: \"6f799ff0-38de-4d07-b0f2-92a945600a96\") " pod="openshift-infra/auto-csr-approver-29564354-rwscz" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.394503 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnqps\" (UniqueName: \"kubernetes.io/projected/6f799ff0-38de-4d07-b0f2-92a945600a96-kube-api-access-hnqps\") pod \"auto-csr-approver-29564354-rwscz\" (UID: \"6f799ff0-38de-4d07-b0f2-92a945600a96\") " pod="openshift-infra/auto-csr-approver-29564354-rwscz" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.495545 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564354-rwscz" Mar 18 19:14:00 crc kubenswrapper[5008]: I0318 19:14:00.955410 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564354-rwscz"] Mar 18 19:14:01 crc kubenswrapper[5008]: I0318 19:14:01.887325 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564354-rwscz" event={"ID":"6f799ff0-38de-4d07-b0f2-92a945600a96","Type":"ContainerStarted","Data":"df708aeab46c0e1ecd1a454e4e426868552521c60a07a3963ee32c3854a1f9b9"} Mar 18 19:14:02 crc kubenswrapper[5008]: I0318 19:14:02.898805 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564354-rwscz" event={"ID":"6f799ff0-38de-4d07-b0f2-92a945600a96","Type":"ContainerStarted","Data":"3748860a1caf8ecf49680e80e5b559635bb3c20b2b3e62f55a39b931b99db9ee"} Mar 18 19:14:02 crc kubenswrapper[5008]: I0318 19:14:02.922866 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564354-rwscz" podStartSLOduration=1.4342400290000001 podStartE2EDuration="2.922843569s" podCreationTimestamp="2026-03-18 19:14:00 +0000 UTC" firstStartedPulling="2026-03-18 19:14:00.967575175 +0000 UTC m=+4297.487048254" lastFinishedPulling="2026-03-18 19:14:02.456178715 +0000 UTC m=+4298.975651794" observedRunningTime="2026-03-18 19:14:02.913076883 +0000 UTC m=+4299.432550002" watchObservedRunningTime="2026-03-18 19:14:02.922843569 +0000 UTC m=+4299.442316658" Mar 18 19:14:03 crc kubenswrapper[5008]: I0318 19:14:03.912804 5008 generic.go:334] "Generic (PLEG): container finished" podID="6f799ff0-38de-4d07-b0f2-92a945600a96" containerID="3748860a1caf8ecf49680e80e5b559635bb3c20b2b3e62f55a39b931b99db9ee" exitCode=0 Mar 18 19:14:03 crc kubenswrapper[5008]: I0318 19:14:03.912976 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564354-rwscz" 
event={"ID":"6f799ff0-38de-4d07-b0f2-92a945600a96","Type":"ContainerDied","Data":"3748860a1caf8ecf49680e80e5b559635bb3c20b2b3e62f55a39b931b99db9ee"} Mar 18 19:14:05 crc kubenswrapper[5008]: I0318 19:14:05.271282 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564354-rwscz" Mar 18 19:14:05 crc kubenswrapper[5008]: I0318 19:14:05.352028 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnqps\" (UniqueName: \"kubernetes.io/projected/6f799ff0-38de-4d07-b0f2-92a945600a96-kube-api-access-hnqps\") pod \"6f799ff0-38de-4d07-b0f2-92a945600a96\" (UID: \"6f799ff0-38de-4d07-b0f2-92a945600a96\") " Mar 18 19:14:05 crc kubenswrapper[5008]: I0318 19:14:05.357124 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f799ff0-38de-4d07-b0f2-92a945600a96-kube-api-access-hnqps" (OuterVolumeSpecName: "kube-api-access-hnqps") pod "6f799ff0-38de-4d07-b0f2-92a945600a96" (UID: "6f799ff0-38de-4d07-b0f2-92a945600a96"). InnerVolumeSpecName "kube-api-access-hnqps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:14:05 crc kubenswrapper[5008]: I0318 19:14:05.453942 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnqps\" (UniqueName: \"kubernetes.io/projected/6f799ff0-38de-4d07-b0f2-92a945600a96-kube-api-access-hnqps\") on node \"crc\" DevicePath \"\"" Mar 18 19:14:05 crc kubenswrapper[5008]: I0318 19:14:05.936437 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564354-rwscz" event={"ID":"6f799ff0-38de-4d07-b0f2-92a945600a96","Type":"ContainerDied","Data":"df708aeab46c0e1ecd1a454e4e426868552521c60a07a3963ee32c3854a1f9b9"} Mar 18 19:14:05 crc kubenswrapper[5008]: I0318 19:14:05.936496 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df708aeab46c0e1ecd1a454e4e426868552521c60a07a3963ee32c3854a1f9b9" Mar 18 19:14:05 crc kubenswrapper[5008]: I0318 19:14:05.936604 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564354-rwscz" Mar 18 19:14:06 crc kubenswrapper[5008]: I0318 19:14:06.025302 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564348-ljs2z"] Mar 18 19:14:06 crc kubenswrapper[5008]: I0318 19:14:06.033236 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564348-ljs2z"] Mar 18 19:14:06 crc kubenswrapper[5008]: I0318 19:14:06.211219 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c14cdb-1d5c-4fab-b0f0-72c992899c8e" path="/var/lib/kubelet/pods/30c14cdb-1d5c-4fab-b0f0-72c992899c8e/volumes" Mar 18 19:14:24 crc kubenswrapper[5008]: I0318 19:14:24.460381 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 19:14:24 crc kubenswrapper[5008]: I0318 19:14:24.461772 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:14:24 crc kubenswrapper[5008]: I0318 19:14:24.461838 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 19:14:24 crc kubenswrapper[5008]: I0318 19:14:24.462401 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 19:14:24 crc kubenswrapper[5008]: I0318 19:14:24.462450 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" gracePeriod=600 Mar 18 19:14:24 crc kubenswrapper[5008]: E0318 19:14:24.604428 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:14:25 crc kubenswrapper[5008]: 
I0318 19:14:25.153623 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86"} Mar 18 19:14:25 crc kubenswrapper[5008]: I0318 19:14:25.153631 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" exitCode=0 Mar 18 19:14:25 crc kubenswrapper[5008]: I0318 19:14:25.153686 5008 scope.go:117] "RemoveContainer" containerID="72a6b748f99aec5a0e48585405946db8a40fd1cec8156208fa6b54edd146499b" Mar 18 19:14:25 crc kubenswrapper[5008]: I0318 19:14:25.154423 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:14:25 crc kubenswrapper[5008]: E0318 19:14:25.154908 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:14:37 crc kubenswrapper[5008]: I0318 19:14:37.198711 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:14:37 crc kubenswrapper[5008]: E0318 19:14:37.201644 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:14:43 crc kubenswrapper[5008]: I0318 19:14:43.093803 5008 scope.go:117] "RemoveContainer" containerID="1d3432ee28fc8cbfeae7ae631489b862ffdf795aacc107244206920842bf193e" Mar 18 19:14:52 crc kubenswrapper[5008]: I0318 19:14:52.199036 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:14:52 crc kubenswrapper[5008]: E0318 19:14:52.200349 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.168585 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv"] Mar 18 19:15:00 crc kubenswrapper[5008]: E0318 19:15:00.170084 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f799ff0-38de-4d07-b0f2-92a945600a96" containerName="oc" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.170115 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f799ff0-38de-4d07-b0f2-92a945600a96" containerName="oc" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.170854 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f799ff0-38de-4d07-b0f2-92a945600a96" containerName="oc" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.171730 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.175678 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.187661 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv"] Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.187938 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.295481 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/822b9f58-1c50-413a-a8bc-2c2a464db6a4-secret-volume\") pod \"collect-profiles-29564355-bn2fv\" (UID: \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.295537 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtz22\" (UniqueName: \"kubernetes.io/projected/822b9f58-1c50-413a-a8bc-2c2a464db6a4-kube-api-access-mtz22\") pod \"collect-profiles-29564355-bn2fv\" (UID: \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.295626 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822b9f58-1c50-413a-a8bc-2c2a464db6a4-config-volume\") pod \"collect-profiles-29564355-bn2fv\" (UID: \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.396759 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/822b9f58-1c50-413a-a8bc-2c2a464db6a4-secret-volume\") pod \"collect-profiles-29564355-bn2fv\" (UID: \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.396813 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtz22\" (UniqueName: \"kubernetes.io/projected/822b9f58-1c50-413a-a8bc-2c2a464db6a4-kube-api-access-mtz22\") pod \"collect-profiles-29564355-bn2fv\" (UID: \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.396840 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822b9f58-1c50-413a-a8bc-2c2a464db6a4-config-volume\") pod \"collect-profiles-29564355-bn2fv\" (UID: \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.401157 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822b9f58-1c50-413a-a8bc-2c2a464db6a4-config-volume\") pod \"collect-profiles-29564355-bn2fv\" (UID: \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.419376 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/822b9f58-1c50-413a-a8bc-2c2a464db6a4-secret-volume\") pod \"collect-profiles-29564355-bn2fv\" (UID: \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.431730 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtz22\" (UniqueName: \"kubernetes.io/projected/822b9f58-1c50-413a-a8bc-2c2a464db6a4-kube-api-access-mtz22\") pod \"collect-profiles-29564355-bn2fv\" (UID: \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" Mar 18 19:15:00 crc kubenswrapper[5008]: I0318 19:15:00.504862 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" Mar 18 19:15:01 crc kubenswrapper[5008]: I0318 19:15:01.001913 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv"] Mar 18 19:15:01 crc kubenswrapper[5008]: I0318 19:15:01.465756 5008 generic.go:334] "Generic (PLEG): container finished" podID="822b9f58-1c50-413a-a8bc-2c2a464db6a4" containerID="270b0b582040d93b73b00bb224ecdd627ce0306e98575bc297b8b90da1a8b0da" exitCode=0 Mar 18 19:15:01 crc kubenswrapper[5008]: I0318 19:15:01.465848 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" event={"ID":"822b9f58-1c50-413a-a8bc-2c2a464db6a4","Type":"ContainerDied","Data":"270b0b582040d93b73b00bb224ecdd627ce0306e98575bc297b8b90da1a8b0da"} Mar 18 19:15:01 crc kubenswrapper[5008]: I0318 19:15:01.466129 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" 
event={"ID":"822b9f58-1c50-413a-a8bc-2c2a464db6a4","Type":"ContainerStarted","Data":"e13151a999554d0c8a8f5cf366e5d21ffa438163acbea1f5a488b126fb2487f4"} Mar 18 19:15:02 crc kubenswrapper[5008]: I0318 19:15:02.821459 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" Mar 18 19:15:02 crc kubenswrapper[5008]: I0318 19:15:02.931970 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822b9f58-1c50-413a-a8bc-2c2a464db6a4-config-volume\") pod \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\" (UID: \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\") " Mar 18 19:15:02 crc kubenswrapper[5008]: I0318 19:15:02.932083 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/822b9f58-1c50-413a-a8bc-2c2a464db6a4-secret-volume\") pod \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\" (UID: \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\") " Mar 18 19:15:02 crc kubenswrapper[5008]: I0318 19:15:02.932311 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtz22\" (UniqueName: \"kubernetes.io/projected/822b9f58-1c50-413a-a8bc-2c2a464db6a4-kube-api-access-mtz22\") pod \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\" (UID: \"822b9f58-1c50-413a-a8bc-2c2a464db6a4\") " Mar 18 19:15:02 crc kubenswrapper[5008]: I0318 19:15:02.932632 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/822b9f58-1c50-413a-a8bc-2c2a464db6a4-config-volume" (OuterVolumeSpecName: "config-volume") pod "822b9f58-1c50-413a-a8bc-2c2a464db6a4" (UID: "822b9f58-1c50-413a-a8bc-2c2a464db6a4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:15:02 crc kubenswrapper[5008]: I0318 19:15:02.932712 5008 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822b9f58-1c50-413a-a8bc-2c2a464db6a4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 19:15:02 crc kubenswrapper[5008]: I0318 19:15:02.941784 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822b9f58-1c50-413a-a8bc-2c2a464db6a4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "822b9f58-1c50-413a-a8bc-2c2a464db6a4" (UID: "822b9f58-1c50-413a-a8bc-2c2a464db6a4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 19:15:02 crc kubenswrapper[5008]: I0318 19:15:02.943816 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822b9f58-1c50-413a-a8bc-2c2a464db6a4-kube-api-access-mtz22" (OuterVolumeSpecName: "kube-api-access-mtz22") pod "822b9f58-1c50-413a-a8bc-2c2a464db6a4" (UID: "822b9f58-1c50-413a-a8bc-2c2a464db6a4"). InnerVolumeSpecName "kube-api-access-mtz22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:15:03 crc kubenswrapper[5008]: I0318 19:15:03.034369 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtz22\" (UniqueName: \"kubernetes.io/projected/822b9f58-1c50-413a-a8bc-2c2a464db6a4-kube-api-access-mtz22\") on node \"crc\" DevicePath \"\"" Mar 18 19:15:03 crc kubenswrapper[5008]: I0318 19:15:03.034423 5008 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/822b9f58-1c50-413a-a8bc-2c2a464db6a4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 19:15:03 crc kubenswrapper[5008]: I0318 19:15:03.487172 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" event={"ID":"822b9f58-1c50-413a-a8bc-2c2a464db6a4","Type":"ContainerDied","Data":"e13151a999554d0c8a8f5cf366e5d21ffa438163acbea1f5a488b126fb2487f4"} Mar 18 19:15:03 crc kubenswrapper[5008]: I0318 19:15:03.487268 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e13151a999554d0c8a8f5cf366e5d21ffa438163acbea1f5a488b126fb2487f4" Mar 18 19:15:03 crc kubenswrapper[5008]: I0318 19:15:03.487393 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564355-bn2fv" Mar 18 19:15:03 crc kubenswrapper[5008]: I0318 19:15:03.916978 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5"] Mar 18 19:15:03 crc kubenswrapper[5008]: I0318 19:15:03.922235 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564310-fg5t5"] Mar 18 19:15:04 crc kubenswrapper[5008]: I0318 19:15:04.215638 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cf23e92-a320-42c0-a05e-fe7aa2ce261e" path="/var/lib/kubelet/pods/1cf23e92-a320-42c0-a05e-fe7aa2ce261e/volumes" Mar 18 19:15:05 crc kubenswrapper[5008]: I0318 19:15:05.198181 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:15:05 crc kubenswrapper[5008]: E0318 19:15:05.198727 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:15:19 crc kubenswrapper[5008]: I0318 19:15:19.199594 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:15:19 crc kubenswrapper[5008]: E0318 19:15:19.200808 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:15:32 crc kubenswrapper[5008]: I0318 19:15:32.198448 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:15:32 crc kubenswrapper[5008]: E0318 19:15:32.201392 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:15:43 crc kubenswrapper[5008]: I0318 19:15:43.171827 5008 scope.go:117] "RemoveContainer" containerID="21dc2cf38e138ecedfee59faa41639fb8c8c183a0eb8c1ecb0685e19f7bbde17" Mar 18 19:15:43 crc kubenswrapper[5008]: I0318 19:15:43.198618 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:15:43 crc kubenswrapper[5008]: E0318 19:15:43.198959 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:15:58 crc kubenswrapper[5008]: I0318 19:15:58.198788 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:15:58 crc kubenswrapper[5008]: E0318 19:15:58.199986 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.165933 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564356-8x77l"] Mar 18 19:16:00 crc kubenswrapper[5008]: E0318 19:16:00.167123 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822b9f58-1c50-413a-a8bc-2c2a464db6a4" containerName="collect-profiles" Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.167161 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="822b9f58-1c50-413a-a8bc-2c2a464db6a4" containerName="collect-profiles" Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.167612 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="822b9f58-1c50-413a-a8bc-2c2a464db6a4" containerName="collect-profiles" Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.168547 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564356-8x77l" Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.172549 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.172748 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.173166 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.180019 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564356-8x77l"] Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.357668 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfjmf\" (UniqueName: \"kubernetes.io/projected/c960b1d4-9bc5-476a-b009-1b468d6be24f-kube-api-access-vfjmf\") pod \"auto-csr-approver-29564356-8x77l\" (UID: \"c960b1d4-9bc5-476a-b009-1b468d6be24f\") " pod="openshift-infra/auto-csr-approver-29564356-8x77l" Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.459996 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfjmf\" (UniqueName: \"kubernetes.io/projected/c960b1d4-9bc5-476a-b009-1b468d6be24f-kube-api-access-vfjmf\") pod \"auto-csr-approver-29564356-8x77l\" (UID: \"c960b1d4-9bc5-476a-b009-1b468d6be24f\") " pod="openshift-infra/auto-csr-approver-29564356-8x77l" Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.497878 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfjmf\" (UniqueName: \"kubernetes.io/projected/c960b1d4-9bc5-476a-b009-1b468d6be24f-kube-api-access-vfjmf\") pod \"auto-csr-approver-29564356-8x77l\" (UID: \"c960b1d4-9bc5-476a-b009-1b468d6be24f\") " 
pod="openshift-infra/auto-csr-approver-29564356-8x77l" Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.501589 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564356-8x77l" Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.761758 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564356-8x77l"] Mar 18 19:16:00 crc kubenswrapper[5008]: I0318 19:16:00.987753 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564356-8x77l" event={"ID":"c960b1d4-9bc5-476a-b009-1b468d6be24f","Type":"ContainerStarted","Data":"680f4ba4455c9f9743f2d00e56ffd774d031086f80363757a52c6cdc768b192f"} Mar 18 19:16:03 crc kubenswrapper[5008]: I0318 19:16:03.010228 5008 generic.go:334] "Generic (PLEG): container finished" podID="c960b1d4-9bc5-476a-b009-1b468d6be24f" containerID="bf823b919175eb6cceede839b695ac5f814f6a4b7702733020481c9ed01b8495" exitCode=0 Mar 18 19:16:03 crc kubenswrapper[5008]: I0318 19:16:03.010328 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564356-8x77l" event={"ID":"c960b1d4-9bc5-476a-b009-1b468d6be24f","Type":"ContainerDied","Data":"bf823b919175eb6cceede839b695ac5f814f6a4b7702733020481c9ed01b8495"} Mar 18 19:16:04 crc kubenswrapper[5008]: I0318 19:16:04.348884 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564356-8x77l" Mar 18 19:16:04 crc kubenswrapper[5008]: I0318 19:16:04.428665 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfjmf\" (UniqueName: \"kubernetes.io/projected/c960b1d4-9bc5-476a-b009-1b468d6be24f-kube-api-access-vfjmf\") pod \"c960b1d4-9bc5-476a-b009-1b468d6be24f\" (UID: \"c960b1d4-9bc5-476a-b009-1b468d6be24f\") " Mar 18 19:16:04 crc kubenswrapper[5008]: I0318 19:16:04.434392 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c960b1d4-9bc5-476a-b009-1b468d6be24f-kube-api-access-vfjmf" (OuterVolumeSpecName: "kube-api-access-vfjmf") pod "c960b1d4-9bc5-476a-b009-1b468d6be24f" (UID: "c960b1d4-9bc5-476a-b009-1b468d6be24f"). InnerVolumeSpecName "kube-api-access-vfjmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:16:04 crc kubenswrapper[5008]: I0318 19:16:04.530286 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfjmf\" (UniqueName: \"kubernetes.io/projected/c960b1d4-9bc5-476a-b009-1b468d6be24f-kube-api-access-vfjmf\") on node \"crc\" DevicePath \"\"" Mar 18 19:16:05 crc kubenswrapper[5008]: I0318 19:16:05.037809 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564356-8x77l" event={"ID":"c960b1d4-9bc5-476a-b009-1b468d6be24f","Type":"ContainerDied","Data":"680f4ba4455c9f9743f2d00e56ffd774d031086f80363757a52c6cdc768b192f"} Mar 18 19:16:05 crc kubenswrapper[5008]: I0318 19:16:05.037887 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="680f4ba4455c9f9743f2d00e56ffd774d031086f80363757a52c6cdc768b192f" Mar 18 19:16:05 crc kubenswrapper[5008]: I0318 19:16:05.037926 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564356-8x77l" Mar 18 19:16:05 crc kubenswrapper[5008]: I0318 19:16:05.436458 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564350-msdmw"] Mar 18 19:16:05 crc kubenswrapper[5008]: I0318 19:16:05.446268 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564350-msdmw"] Mar 18 19:16:06 crc kubenswrapper[5008]: I0318 19:16:06.207370 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab807b4-57f7-4a89-9960-6168ded9ee75" path="/var/lib/kubelet/pods/0ab807b4-57f7-4a89-9960-6168ded9ee75/volumes" Mar 18 19:16:10 crc kubenswrapper[5008]: I0318 19:16:10.198422 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:16:10 crc kubenswrapper[5008]: E0318 19:16:10.199476 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:16:23 crc kubenswrapper[5008]: I0318 19:16:23.199299 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:16:23 crc kubenswrapper[5008]: E0318 19:16:23.201014 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" 
podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:16:37 crc kubenswrapper[5008]: I0318 19:16:37.198738 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:16:37 crc kubenswrapper[5008]: E0318 19:16:37.199939 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:16:43 crc kubenswrapper[5008]: I0318 19:16:43.220689 5008 scope.go:117] "RemoveContainer" containerID="db08d893569f33aa9300f5bd9a940a8f11ad957fca9fc0f2752dea94fb6ccc83" Mar 18 19:16:48 crc kubenswrapper[5008]: I0318 19:16:48.198978 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:16:48 crc kubenswrapper[5008]: E0318 19:16:48.199888 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:17:00 crc kubenswrapper[5008]: I0318 19:17:00.198518 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:17:00 crc kubenswrapper[5008]: E0318 19:17:00.199894 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:17:13 crc kubenswrapper[5008]: I0318 19:17:13.197773 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:17:13 crc kubenswrapper[5008]: E0318 19:17:13.198457 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:17:26 crc kubenswrapper[5008]: I0318 19:17:26.198739 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:17:26 crc kubenswrapper[5008]: E0318 19:17:26.199891 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:17:41 crc kubenswrapper[5008]: I0318 19:17:41.198929 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:17:41 crc kubenswrapper[5008]: E0318 19:17:41.200009 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:17:53 crc kubenswrapper[5008]: I0318 19:17:53.198049 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:17:53 crc kubenswrapper[5008]: E0318 19:17:53.198741 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:18:00 crc kubenswrapper[5008]: I0318 19:18:00.167122 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564358-m5dff"] Mar 18 19:18:00 crc kubenswrapper[5008]: E0318 19:18:00.168639 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c960b1d4-9bc5-476a-b009-1b468d6be24f" containerName="oc" Mar 18 19:18:00 crc kubenswrapper[5008]: I0318 19:18:00.168675 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c960b1d4-9bc5-476a-b009-1b468d6be24f" containerName="oc" Mar 18 19:18:00 crc kubenswrapper[5008]: I0318 19:18:00.169039 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c960b1d4-9bc5-476a-b009-1b468d6be24f" containerName="oc" Mar 18 19:18:00 crc kubenswrapper[5008]: I0318 19:18:00.170276 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564358-m5dff" Mar 18 19:18:00 crc kubenswrapper[5008]: I0318 19:18:00.175665 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:18:00 crc kubenswrapper[5008]: I0318 19:18:00.175717 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:18:00 crc kubenswrapper[5008]: I0318 19:18:00.175835 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:18:00 crc kubenswrapper[5008]: I0318 19:18:00.181480 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564358-m5dff"] Mar 18 19:18:00 crc kubenswrapper[5008]: I0318 19:18:00.325778 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pffqp\" (UniqueName: \"kubernetes.io/projected/52398f8d-42bf-41c8-87d6-01abb0f79a04-kube-api-access-pffqp\") pod \"auto-csr-approver-29564358-m5dff\" (UID: \"52398f8d-42bf-41c8-87d6-01abb0f79a04\") " pod="openshift-infra/auto-csr-approver-29564358-m5dff" Mar 18 19:18:00 crc kubenswrapper[5008]: I0318 19:18:00.428232 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pffqp\" (UniqueName: \"kubernetes.io/projected/52398f8d-42bf-41c8-87d6-01abb0f79a04-kube-api-access-pffqp\") pod \"auto-csr-approver-29564358-m5dff\" (UID: \"52398f8d-42bf-41c8-87d6-01abb0f79a04\") " pod="openshift-infra/auto-csr-approver-29564358-m5dff" Mar 18 19:18:00 crc kubenswrapper[5008]: I0318 19:18:00.467267 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pffqp\" (UniqueName: \"kubernetes.io/projected/52398f8d-42bf-41c8-87d6-01abb0f79a04-kube-api-access-pffqp\") pod \"auto-csr-approver-29564358-m5dff\" (UID: \"52398f8d-42bf-41c8-87d6-01abb0f79a04\") " 
pod="openshift-infra/auto-csr-approver-29564358-m5dff" Mar 18 19:18:00 crc kubenswrapper[5008]: I0318 19:18:00.504309 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564358-m5dff" Mar 18 19:18:00 crc kubenswrapper[5008]: I0318 19:18:00.968042 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564358-m5dff"] Mar 18 19:18:01 crc kubenswrapper[5008]: I0318 19:18:01.042689 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564358-m5dff" event={"ID":"52398f8d-42bf-41c8-87d6-01abb0f79a04","Type":"ContainerStarted","Data":"34016a54ed17de2d1838c04f7513e4bb3774e9710a78b4831302faee29e9901e"} Mar 18 19:18:03 crc kubenswrapper[5008]: I0318 19:18:03.065294 5008 generic.go:334] "Generic (PLEG): container finished" podID="52398f8d-42bf-41c8-87d6-01abb0f79a04" containerID="a749c07a415c12f78e7aa072e5e914dceabc4e883ab02814fff9eaddd27f819d" exitCode=0 Mar 18 19:18:03 crc kubenswrapper[5008]: I0318 19:18:03.066031 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564358-m5dff" event={"ID":"52398f8d-42bf-41c8-87d6-01abb0f79a04","Type":"ContainerDied","Data":"a749c07a415c12f78e7aa072e5e914dceabc4e883ab02814fff9eaddd27f819d"} Mar 18 19:18:04 crc kubenswrapper[5008]: I0318 19:18:04.417877 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564358-m5dff" Mar 18 19:18:04 crc kubenswrapper[5008]: I0318 19:18:04.528926 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pffqp\" (UniqueName: \"kubernetes.io/projected/52398f8d-42bf-41c8-87d6-01abb0f79a04-kube-api-access-pffqp\") pod \"52398f8d-42bf-41c8-87d6-01abb0f79a04\" (UID: \"52398f8d-42bf-41c8-87d6-01abb0f79a04\") " Mar 18 19:18:04 crc kubenswrapper[5008]: I0318 19:18:04.535082 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52398f8d-42bf-41c8-87d6-01abb0f79a04-kube-api-access-pffqp" (OuterVolumeSpecName: "kube-api-access-pffqp") pod "52398f8d-42bf-41c8-87d6-01abb0f79a04" (UID: "52398f8d-42bf-41c8-87d6-01abb0f79a04"). InnerVolumeSpecName "kube-api-access-pffqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:18:04 crc kubenswrapper[5008]: I0318 19:18:04.630431 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pffqp\" (UniqueName: \"kubernetes.io/projected/52398f8d-42bf-41c8-87d6-01abb0f79a04-kube-api-access-pffqp\") on node \"crc\" DevicePath \"\"" Mar 18 19:18:05 crc kubenswrapper[5008]: I0318 19:18:05.091489 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564358-m5dff" event={"ID":"52398f8d-42bf-41c8-87d6-01abb0f79a04","Type":"ContainerDied","Data":"34016a54ed17de2d1838c04f7513e4bb3774e9710a78b4831302faee29e9901e"} Mar 18 19:18:05 crc kubenswrapper[5008]: I0318 19:18:05.091601 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564358-m5dff" Mar 18 19:18:05 crc kubenswrapper[5008]: I0318 19:18:05.091992 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34016a54ed17de2d1838c04f7513e4bb3774e9710a78b4831302faee29e9901e" Mar 18 19:18:05 crc kubenswrapper[5008]: I0318 19:18:05.198476 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:18:05 crc kubenswrapper[5008]: E0318 19:18:05.198852 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:18:05 crc kubenswrapper[5008]: I0318 19:18:05.508277 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564352-cl8cg"] Mar 18 19:18:05 crc kubenswrapper[5008]: I0318 19:18:05.519957 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564352-cl8cg"] Mar 18 19:18:06 crc kubenswrapper[5008]: I0318 19:18:06.214217 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7adc401c-a2de-495c-8bd5-a6979e497ead" path="/var/lib/kubelet/pods/7adc401c-a2de-495c-8bd5-a6979e497ead/volumes" Mar 18 19:18:17 crc kubenswrapper[5008]: I0318 19:18:17.199000 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:18:17 crc kubenswrapper[5008]: E0318 19:18:17.200873 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:18:29 crc kubenswrapper[5008]: I0318 19:18:29.198763 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:18:29 crc kubenswrapper[5008]: E0318 19:18:29.200089 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:18:42 crc kubenswrapper[5008]: I0318 19:18:42.202891 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:18:42 crc kubenswrapper[5008]: E0318 19:18:42.203613 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:18:43 crc kubenswrapper[5008]: I0318 19:18:43.349164 5008 scope.go:117] "RemoveContainer" containerID="aa1ccc47f6467659a5cd35841ee4aa57415399474e80bf0862afdfd514f7d0c6" Mar 18 19:18:56 crc kubenswrapper[5008]: I0318 19:18:56.198925 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:18:56 crc kubenswrapper[5008]: 
E0318 19:18:56.199807 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:19:08 crc kubenswrapper[5008]: I0318 19:19:08.198692 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:19:08 crc kubenswrapper[5008]: E0318 19:19:08.199548 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.478004 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-wrdt5"] Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.486359 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-wrdt5"] Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.600912 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-p2tsb"] Mar 18 19:19:18 crc kubenswrapper[5008]: E0318 19:19:18.601825 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52398f8d-42bf-41c8-87d6-01abb0f79a04" containerName="oc" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.601970 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="52398f8d-42bf-41c8-87d6-01abb0f79a04" containerName="oc" Mar 18 19:19:18 crc 
kubenswrapper[5008]: I0318 19:19:18.602234 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="52398f8d-42bf-41c8-87d6-01abb0f79a04" containerName="oc" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.602886 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p2tsb" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.605955 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.606382 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.606601 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.608360 5008 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-mk8ls" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.612951 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p2tsb"] Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.746016 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4d26\" (UniqueName: \"kubernetes.io/projected/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-kube-api-access-c4d26\") pod \"crc-storage-crc-p2tsb\" (UID: \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\") " pod="crc-storage/crc-storage-crc-p2tsb" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.746326 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-crc-storage\") pod \"crc-storage-crc-p2tsb\" (UID: \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\") " pod="crc-storage/crc-storage-crc-p2tsb" Mar 18 19:19:18 
crc kubenswrapper[5008]: I0318 19:19:18.746457 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-node-mnt\") pod \"crc-storage-crc-p2tsb\" (UID: \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\") " pod="crc-storage/crc-storage-crc-p2tsb" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.848539 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4d26\" (UniqueName: \"kubernetes.io/projected/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-kube-api-access-c4d26\") pod \"crc-storage-crc-p2tsb\" (UID: \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\") " pod="crc-storage/crc-storage-crc-p2tsb" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.848738 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-crc-storage\") pod \"crc-storage-crc-p2tsb\" (UID: \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\") " pod="crc-storage/crc-storage-crc-p2tsb" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.848824 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-node-mnt\") pod \"crc-storage-crc-p2tsb\" (UID: \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\") " pod="crc-storage/crc-storage-crc-p2tsb" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.849422 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-node-mnt\") pod \"crc-storage-crc-p2tsb\" (UID: \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\") " pod="crc-storage/crc-storage-crc-p2tsb" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.850988 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"crc-storage\" (UniqueName: \"kubernetes.io/configmap/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-crc-storage\") pod \"crc-storage-crc-p2tsb\" (UID: \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\") " pod="crc-storage/crc-storage-crc-p2tsb" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.884869 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4d26\" (UniqueName: \"kubernetes.io/projected/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-kube-api-access-c4d26\") pod \"crc-storage-crc-p2tsb\" (UID: \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\") " pod="crc-storage/crc-storage-crc-p2tsb" Mar 18 19:19:18 crc kubenswrapper[5008]: I0318 19:19:18.926863 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p2tsb" Mar 18 19:19:19 crc kubenswrapper[5008]: I0318 19:19:19.421636 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p2tsb"] Mar 18 19:19:19 crc kubenswrapper[5008]: I0318 19:19:19.431402 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 19:19:19 crc kubenswrapper[5008]: I0318 19:19:19.791934 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p2tsb" event={"ID":"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4","Type":"ContainerStarted","Data":"cf75e7d86fd084194691e4302cf053921c0511dec04bb2b5913923fffd4bb1cc"} Mar 18 19:19:20 crc kubenswrapper[5008]: I0318 19:19:20.214929 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d783fba-34bb-4969-a6eb-59c96cd838f6" path="/var/lib/kubelet/pods/2d783fba-34bb-4969-a6eb-59c96cd838f6/volumes" Mar 18 19:19:20 crc kubenswrapper[5008]: I0318 19:19:20.799685 5008 generic.go:334] "Generic (PLEG): container finished" podID="82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4" containerID="c9d227e75affc9bd49500787dd7499aea5f14d0fde407bf55af0e651a47a380d" exitCode=0 Mar 18 19:19:20 crc kubenswrapper[5008]: I0318 19:19:20.799769 5008 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p2tsb" event={"ID":"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4","Type":"ContainerDied","Data":"c9d227e75affc9bd49500787dd7499aea5f14d0fde407bf55af0e651a47a380d"} Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.198397 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:19:22 crc kubenswrapper[5008]: E0318 19:19:22.199288 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.237701 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p2tsb" Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.413404 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-crc-storage\") pod \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\" (UID: \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\") " Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.413477 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4d26\" (UniqueName: \"kubernetes.io/projected/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-kube-api-access-c4d26\") pod \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\" (UID: \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\") " Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.413638 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-node-mnt\") pod \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\" (UID: \"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4\") " Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.413774 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4" (UID: "82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.413946 5008 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.420013 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-kube-api-access-c4d26" (OuterVolumeSpecName: "kube-api-access-c4d26") pod "82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4" (UID: "82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4"). InnerVolumeSpecName "kube-api-access-c4d26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.435305 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4" (UID: "82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.515294 5008 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.515332 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4d26\" (UniqueName: \"kubernetes.io/projected/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4-kube-api-access-c4d26\") on node \"crc\" DevicePath \"\"" Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.818502 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p2tsb" event={"ID":"82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4","Type":"ContainerDied","Data":"cf75e7d86fd084194691e4302cf053921c0511dec04bb2b5913923fffd4bb1cc"} Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.818595 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf75e7d86fd084194691e4302cf053921c0511dec04bb2b5913923fffd4bb1cc" Mar 18 19:19:22 crc kubenswrapper[5008]: I0318 19:19:22.819320 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p2tsb" Mar 18 19:19:24 crc kubenswrapper[5008]: I0318 19:19:24.886547 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-p2tsb"] Mar 18 19:19:24 crc kubenswrapper[5008]: I0318 19:19:24.893659 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-p2tsb"] Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.077006 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-srcp5"] Mar 18 19:19:25 crc kubenswrapper[5008]: E0318 19:19:25.077676 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4" containerName="storage" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.077710 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4" containerName="storage" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.078044 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4" containerName="storage" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.078993 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-srcp5" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.083306 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.086149 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.086483 5008 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-mk8ls" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.086783 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.087907 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-srcp5"] Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.259180 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4f4e45c3-496a-400a-8f59-40f97c62e1e7-crc-storage\") pod \"crc-storage-crc-srcp5\" (UID: \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\") " pod="crc-storage/crc-storage-crc-srcp5" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.259276 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4f4e45c3-496a-400a-8f59-40f97c62e1e7-node-mnt\") pod \"crc-storage-crc-srcp5\" (UID: \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\") " pod="crc-storage/crc-storage-crc-srcp5" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.259442 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcnql\" (UniqueName: \"kubernetes.io/projected/4f4e45c3-496a-400a-8f59-40f97c62e1e7-kube-api-access-bcnql\") pod \"crc-storage-crc-srcp5\" (UID: 
\"4f4e45c3-496a-400a-8f59-40f97c62e1e7\") " pod="crc-storage/crc-storage-crc-srcp5" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.361278 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcnql\" (UniqueName: \"kubernetes.io/projected/4f4e45c3-496a-400a-8f59-40f97c62e1e7-kube-api-access-bcnql\") pod \"crc-storage-crc-srcp5\" (UID: \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\") " pod="crc-storage/crc-storage-crc-srcp5" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.361399 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4f4e45c3-496a-400a-8f59-40f97c62e1e7-crc-storage\") pod \"crc-storage-crc-srcp5\" (UID: \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\") " pod="crc-storage/crc-storage-crc-srcp5" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.361454 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4f4e45c3-496a-400a-8f59-40f97c62e1e7-node-mnt\") pod \"crc-storage-crc-srcp5\" (UID: \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\") " pod="crc-storage/crc-storage-crc-srcp5" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.361936 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4f4e45c3-496a-400a-8f59-40f97c62e1e7-node-mnt\") pod \"crc-storage-crc-srcp5\" (UID: \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\") " pod="crc-storage/crc-storage-crc-srcp5" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.362716 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4f4e45c3-496a-400a-8f59-40f97c62e1e7-crc-storage\") pod \"crc-storage-crc-srcp5\" (UID: \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\") " pod="crc-storage/crc-storage-crc-srcp5" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.385232 5008 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcnql\" (UniqueName: \"kubernetes.io/projected/4f4e45c3-496a-400a-8f59-40f97c62e1e7-kube-api-access-bcnql\") pod \"crc-storage-crc-srcp5\" (UID: \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\") " pod="crc-storage/crc-storage-crc-srcp5" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.404545 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-srcp5" Mar 18 19:19:25 crc kubenswrapper[5008]: I0318 19:19:25.884630 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-srcp5"] Mar 18 19:19:26 crc kubenswrapper[5008]: I0318 19:19:26.215950 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4" path="/var/lib/kubelet/pods/82f6f027-f0d8-4e6a-8bd1-2afa9b5479e4/volumes" Mar 18 19:19:26 crc kubenswrapper[5008]: I0318 19:19:26.859720 5008 generic.go:334] "Generic (PLEG): container finished" podID="4f4e45c3-496a-400a-8f59-40f97c62e1e7" containerID="a158adbb01c4eadad7e7991aef539c2e60b81244d8ffca7b31f32e5d9c7c56a2" exitCode=0 Mar 18 19:19:26 crc kubenswrapper[5008]: I0318 19:19:26.860027 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-srcp5" event={"ID":"4f4e45c3-496a-400a-8f59-40f97c62e1e7","Type":"ContainerDied","Data":"a158adbb01c4eadad7e7991aef539c2e60b81244d8ffca7b31f32e5d9c7c56a2"} Mar 18 19:19:26 crc kubenswrapper[5008]: I0318 19:19:26.860058 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-srcp5" event={"ID":"4f4e45c3-496a-400a-8f59-40f97c62e1e7","Type":"ContainerStarted","Data":"9b3b251e6550eefef3ebe2147f8d18a1fa1cd418d3ffc15dfdfecde0a4b2d8e2"} Mar 18 19:19:28 crc kubenswrapper[5008]: I0318 19:19:28.232895 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-srcp5" Mar 18 19:19:28 crc kubenswrapper[5008]: I0318 19:19:28.409619 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4f4e45c3-496a-400a-8f59-40f97c62e1e7-node-mnt\") pod \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\" (UID: \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\") " Mar 18 19:19:28 crc kubenswrapper[5008]: I0318 19:19:28.409688 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcnql\" (UniqueName: \"kubernetes.io/projected/4f4e45c3-496a-400a-8f59-40f97c62e1e7-kube-api-access-bcnql\") pod \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\" (UID: \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\") " Mar 18 19:19:28 crc kubenswrapper[5008]: I0318 19:19:28.409769 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4f4e45c3-496a-400a-8f59-40f97c62e1e7-crc-storage\") pod \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\" (UID: \"4f4e45c3-496a-400a-8f59-40f97c62e1e7\") " Mar 18 19:19:28 crc kubenswrapper[5008]: I0318 19:19:28.409755 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f4e45c3-496a-400a-8f59-40f97c62e1e7-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "4f4e45c3-496a-400a-8f59-40f97c62e1e7" (UID: "4f4e45c3-496a-400a-8f59-40f97c62e1e7"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 19:19:28 crc kubenswrapper[5008]: I0318 19:19:28.409982 5008 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4f4e45c3-496a-400a-8f59-40f97c62e1e7-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 18 19:19:28 crc kubenswrapper[5008]: I0318 19:19:28.414427 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f4e45c3-496a-400a-8f59-40f97c62e1e7-kube-api-access-bcnql" (OuterVolumeSpecName: "kube-api-access-bcnql") pod "4f4e45c3-496a-400a-8f59-40f97c62e1e7" (UID: "4f4e45c3-496a-400a-8f59-40f97c62e1e7"). InnerVolumeSpecName "kube-api-access-bcnql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:19:28 crc kubenswrapper[5008]: I0318 19:19:28.429020 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f4e45c3-496a-400a-8f59-40f97c62e1e7-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "4f4e45c3-496a-400a-8f59-40f97c62e1e7" (UID: "4f4e45c3-496a-400a-8f59-40f97c62e1e7"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:19:28 crc kubenswrapper[5008]: I0318 19:19:28.511864 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcnql\" (UniqueName: \"kubernetes.io/projected/4f4e45c3-496a-400a-8f59-40f97c62e1e7-kube-api-access-bcnql\") on node \"crc\" DevicePath \"\"" Mar 18 19:19:28 crc kubenswrapper[5008]: I0318 19:19:28.511905 5008 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4f4e45c3-496a-400a-8f59-40f97c62e1e7-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 18 19:19:28 crc kubenswrapper[5008]: I0318 19:19:28.879901 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-srcp5" event={"ID":"4f4e45c3-496a-400a-8f59-40f97c62e1e7","Type":"ContainerDied","Data":"9b3b251e6550eefef3ebe2147f8d18a1fa1cd418d3ffc15dfdfecde0a4b2d8e2"} Mar 18 19:19:28 crc kubenswrapper[5008]: I0318 19:19:28.879941 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b3b251e6550eefef3ebe2147f8d18a1fa1cd418d3ffc15dfdfecde0a4b2d8e2" Mar 18 19:19:28 crc kubenswrapper[5008]: I0318 19:19:28.879970 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-srcp5" Mar 18 19:19:36 crc kubenswrapper[5008]: I0318 19:19:36.199072 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:19:36 crc kubenswrapper[5008]: I0318 19:19:36.947175 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"4d595231fbacf4e90ffb123dbddc3f5bb05b324fc9dae73ed4f109d00d75ea52"} Mar 18 19:19:43 crc kubenswrapper[5008]: I0318 19:19:43.439656 5008 scope.go:117] "RemoveContainer" containerID="c21904ad77e04ffffcedfc97a0b530ee6c7a3f64dac9b6dcde055b60ae8aeae5" Mar 18 19:20:00 crc kubenswrapper[5008]: I0318 19:20:00.165376 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564360-v5fqc"] Mar 18 19:20:00 crc kubenswrapper[5008]: E0318 19:20:00.166367 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4e45c3-496a-400a-8f59-40f97c62e1e7" containerName="storage" Mar 18 19:20:00 crc kubenswrapper[5008]: I0318 19:20:00.166385 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4e45c3-496a-400a-8f59-40f97c62e1e7" containerName="storage" Mar 18 19:20:00 crc kubenswrapper[5008]: I0318 19:20:00.166632 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4e45c3-496a-400a-8f59-40f97c62e1e7" containerName="storage" Mar 18 19:20:00 crc kubenswrapper[5008]: I0318 19:20:00.167176 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564360-v5fqc" Mar 18 19:20:00 crc kubenswrapper[5008]: I0318 19:20:00.172188 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:20:00 crc kubenswrapper[5008]: I0318 19:20:00.173128 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:20:00 crc kubenswrapper[5008]: I0318 19:20:00.173277 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:20:00 crc kubenswrapper[5008]: I0318 19:20:00.178633 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564360-v5fqc"] Mar 18 19:20:00 crc kubenswrapper[5008]: I0318 19:20:00.212322 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvg7b\" (UniqueName: \"kubernetes.io/projected/8db7a27e-6ce2-4068-aff2-0f476262ec37-kube-api-access-gvg7b\") pod \"auto-csr-approver-29564360-v5fqc\" (UID: \"8db7a27e-6ce2-4068-aff2-0f476262ec37\") " pod="openshift-infra/auto-csr-approver-29564360-v5fqc" Mar 18 19:20:00 crc kubenswrapper[5008]: I0318 19:20:00.313761 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvg7b\" (UniqueName: \"kubernetes.io/projected/8db7a27e-6ce2-4068-aff2-0f476262ec37-kube-api-access-gvg7b\") pod \"auto-csr-approver-29564360-v5fqc\" (UID: \"8db7a27e-6ce2-4068-aff2-0f476262ec37\") " pod="openshift-infra/auto-csr-approver-29564360-v5fqc" Mar 18 19:20:00 crc kubenswrapper[5008]: I0318 19:20:00.337162 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvg7b\" (UniqueName: \"kubernetes.io/projected/8db7a27e-6ce2-4068-aff2-0f476262ec37-kube-api-access-gvg7b\") pod \"auto-csr-approver-29564360-v5fqc\" (UID: \"8db7a27e-6ce2-4068-aff2-0f476262ec37\") " 
pod="openshift-infra/auto-csr-approver-29564360-v5fqc" Mar 18 19:20:00 crc kubenswrapper[5008]: I0318 19:20:00.510238 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564360-v5fqc" Mar 18 19:20:00 crc kubenswrapper[5008]: I0318 19:20:00.965105 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564360-v5fqc"] Mar 18 19:20:01 crc kubenswrapper[5008]: I0318 19:20:01.153522 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564360-v5fqc" event={"ID":"8db7a27e-6ce2-4068-aff2-0f476262ec37","Type":"ContainerStarted","Data":"2b644d360f3c7277af3d8add74269184ba68178c37b83b0d51480e7697d9b962"} Mar 18 19:20:03 crc kubenswrapper[5008]: I0318 19:20:03.169178 5008 generic.go:334] "Generic (PLEG): container finished" podID="8db7a27e-6ce2-4068-aff2-0f476262ec37" containerID="8536dbd92da6ca89cfc54e81f61eeec43f9d339c033b1fb2ef4cbb3acf8e4bc5" exitCode=0 Mar 18 19:20:03 crc kubenswrapper[5008]: I0318 19:20:03.169300 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564360-v5fqc" event={"ID":"8db7a27e-6ce2-4068-aff2-0f476262ec37","Type":"ContainerDied","Data":"8536dbd92da6ca89cfc54e81f61eeec43f9d339c033b1fb2ef4cbb3acf8e4bc5"} Mar 18 19:20:04 crc kubenswrapper[5008]: I0318 19:20:04.473291 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564360-v5fqc" Mar 18 19:20:04 crc kubenswrapper[5008]: I0318 19:20:04.579425 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvg7b\" (UniqueName: \"kubernetes.io/projected/8db7a27e-6ce2-4068-aff2-0f476262ec37-kube-api-access-gvg7b\") pod \"8db7a27e-6ce2-4068-aff2-0f476262ec37\" (UID: \"8db7a27e-6ce2-4068-aff2-0f476262ec37\") " Mar 18 19:20:04 crc kubenswrapper[5008]: I0318 19:20:04.589414 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db7a27e-6ce2-4068-aff2-0f476262ec37-kube-api-access-gvg7b" (OuterVolumeSpecName: "kube-api-access-gvg7b") pod "8db7a27e-6ce2-4068-aff2-0f476262ec37" (UID: "8db7a27e-6ce2-4068-aff2-0f476262ec37"). InnerVolumeSpecName "kube-api-access-gvg7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:20:04 crc kubenswrapper[5008]: I0318 19:20:04.681690 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvg7b\" (UniqueName: \"kubernetes.io/projected/8db7a27e-6ce2-4068-aff2-0f476262ec37-kube-api-access-gvg7b\") on node \"crc\" DevicePath \"\"" Mar 18 19:20:05 crc kubenswrapper[5008]: I0318 19:20:05.187307 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564360-v5fqc" event={"ID":"8db7a27e-6ce2-4068-aff2-0f476262ec37","Type":"ContainerDied","Data":"2b644d360f3c7277af3d8add74269184ba68178c37b83b0d51480e7697d9b962"} Mar 18 19:20:05 crc kubenswrapper[5008]: I0318 19:20:05.187346 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b644d360f3c7277af3d8add74269184ba68178c37b83b0d51480e7697d9b962" Mar 18 19:20:05 crc kubenswrapper[5008]: I0318 19:20:05.187402 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564360-v5fqc" Mar 18 19:20:05 crc kubenswrapper[5008]: I0318 19:20:05.554288 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564354-rwscz"] Mar 18 19:20:05 crc kubenswrapper[5008]: I0318 19:20:05.578424 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564354-rwscz"] Mar 18 19:20:06 crc kubenswrapper[5008]: I0318 19:20:06.227394 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f799ff0-38de-4d07-b0f2-92a945600a96" path="/var/lib/kubelet/pods/6f799ff0-38de-4d07-b0f2-92a945600a96/volumes" Mar 18 19:20:44 crc kubenswrapper[5008]: I0318 19:20:44.113506 5008 scope.go:117] "RemoveContainer" containerID="3748860a1caf8ecf49680e80e5b559635bb3c20b2b3e62f55a39b931b99db9ee" Mar 18 19:21:54 crc kubenswrapper[5008]: I0318 19:21:54.460156 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:21:54 crc kubenswrapper[5008]: I0318 19:21:54.460900 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:22:00 crc kubenswrapper[5008]: I0318 19:22:00.167694 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564362-sphbx"] Mar 18 19:22:00 crc kubenswrapper[5008]: E0318 19:22:00.183070 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db7a27e-6ce2-4068-aff2-0f476262ec37" containerName="oc" Mar 18 19:22:00 crc 
kubenswrapper[5008]: I0318 19:22:00.183128 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db7a27e-6ce2-4068-aff2-0f476262ec37" containerName="oc" Mar 18 19:22:00 crc kubenswrapper[5008]: I0318 19:22:00.184002 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db7a27e-6ce2-4068-aff2-0f476262ec37" containerName="oc" Mar 18 19:22:00 crc kubenswrapper[5008]: I0318 19:22:00.185286 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564362-sphbx" Mar 18 19:22:00 crc kubenswrapper[5008]: I0318 19:22:00.187645 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564362-sphbx"] Mar 18 19:22:00 crc kubenswrapper[5008]: I0318 19:22:00.189404 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:22:00 crc kubenswrapper[5008]: I0318 19:22:00.189673 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:22:00 crc kubenswrapper[5008]: I0318 19:22:00.189998 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:22:00 crc kubenswrapper[5008]: I0318 19:22:00.246295 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfblc\" (UniqueName: \"kubernetes.io/projected/5b9d9d1e-744e-4aa0-970c-c17d42363ac4-kube-api-access-vfblc\") pod \"auto-csr-approver-29564362-sphbx\" (UID: \"5b9d9d1e-744e-4aa0-970c-c17d42363ac4\") " pod="openshift-infra/auto-csr-approver-29564362-sphbx" Mar 18 19:22:00 crc kubenswrapper[5008]: I0318 19:22:00.348909 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfblc\" (UniqueName: \"kubernetes.io/projected/5b9d9d1e-744e-4aa0-970c-c17d42363ac4-kube-api-access-vfblc\") pod \"auto-csr-approver-29564362-sphbx\" 
(UID: \"5b9d9d1e-744e-4aa0-970c-c17d42363ac4\") " pod="openshift-infra/auto-csr-approver-29564362-sphbx" Mar 18 19:22:00 crc kubenswrapper[5008]: I0318 19:22:00.375894 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfblc\" (UniqueName: \"kubernetes.io/projected/5b9d9d1e-744e-4aa0-970c-c17d42363ac4-kube-api-access-vfblc\") pod \"auto-csr-approver-29564362-sphbx\" (UID: \"5b9d9d1e-744e-4aa0-970c-c17d42363ac4\") " pod="openshift-infra/auto-csr-approver-29564362-sphbx" Mar 18 19:22:00 crc kubenswrapper[5008]: I0318 19:22:00.509306 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564362-sphbx" Mar 18 19:22:01 crc kubenswrapper[5008]: I0318 19:22:01.059965 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564362-sphbx"] Mar 18 19:22:01 crc kubenswrapper[5008]: I0318 19:22:01.189878 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564362-sphbx" event={"ID":"5b9d9d1e-744e-4aa0-970c-c17d42363ac4","Type":"ContainerStarted","Data":"dd91fdafd7d5898a9190f34c5c44e98e3794d3282737543c38bfefdc2a531aa2"} Mar 18 19:22:03 crc kubenswrapper[5008]: I0318 19:22:03.209521 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564362-sphbx" event={"ID":"5b9d9d1e-744e-4aa0-970c-c17d42363ac4","Type":"ContainerStarted","Data":"83962fc7c9e70fcff217792fddca4f6b2eab63e53d979a327a2fbbae8c221b82"} Mar 18 19:22:03 crc kubenswrapper[5008]: I0318 19:22:03.232657 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564362-sphbx" podStartSLOduration=1.7820828199999998 podStartE2EDuration="3.232637709s" podCreationTimestamp="2026-03-18 19:22:00 +0000 UTC" firstStartedPulling="2026-03-18 19:22:01.074997025 +0000 UTC m=+4777.594470144" lastFinishedPulling="2026-03-18 19:22:02.525551914 +0000 UTC 
m=+4779.045025033" observedRunningTime="2026-03-18 19:22:03.226493561 +0000 UTC m=+4779.745966730" watchObservedRunningTime="2026-03-18 19:22:03.232637709 +0000 UTC m=+4779.752110788" Mar 18 19:22:04 crc kubenswrapper[5008]: I0318 19:22:04.219289 5008 generic.go:334] "Generic (PLEG): container finished" podID="5b9d9d1e-744e-4aa0-970c-c17d42363ac4" containerID="83962fc7c9e70fcff217792fddca4f6b2eab63e53d979a327a2fbbae8c221b82" exitCode=0 Mar 18 19:22:04 crc kubenswrapper[5008]: I0318 19:22:04.219629 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564362-sphbx" event={"ID":"5b9d9d1e-744e-4aa0-970c-c17d42363ac4","Type":"ContainerDied","Data":"83962fc7c9e70fcff217792fddca4f6b2eab63e53d979a327a2fbbae8c221b82"} Mar 18 19:22:05 crc kubenswrapper[5008]: I0318 19:22:05.604531 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564362-sphbx" Mar 18 19:22:05 crc kubenswrapper[5008]: I0318 19:22:05.625345 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfblc\" (UniqueName: \"kubernetes.io/projected/5b9d9d1e-744e-4aa0-970c-c17d42363ac4-kube-api-access-vfblc\") pod \"5b9d9d1e-744e-4aa0-970c-c17d42363ac4\" (UID: \"5b9d9d1e-744e-4aa0-970c-c17d42363ac4\") " Mar 18 19:22:05 crc kubenswrapper[5008]: I0318 19:22:05.632153 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b9d9d1e-744e-4aa0-970c-c17d42363ac4-kube-api-access-vfblc" (OuterVolumeSpecName: "kube-api-access-vfblc") pod "5b9d9d1e-744e-4aa0-970c-c17d42363ac4" (UID: "5b9d9d1e-744e-4aa0-970c-c17d42363ac4"). InnerVolumeSpecName "kube-api-access-vfblc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:22:05 crc kubenswrapper[5008]: I0318 19:22:05.728072 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfblc\" (UniqueName: \"kubernetes.io/projected/5b9d9d1e-744e-4aa0-970c-c17d42363ac4-kube-api-access-vfblc\") on node \"crc\" DevicePath \"\"" Mar 18 19:22:06 crc kubenswrapper[5008]: I0318 19:22:06.258365 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564362-sphbx" event={"ID":"5b9d9d1e-744e-4aa0-970c-c17d42363ac4","Type":"ContainerDied","Data":"dd91fdafd7d5898a9190f34c5c44e98e3794d3282737543c38bfefdc2a531aa2"} Mar 18 19:22:06 crc kubenswrapper[5008]: I0318 19:22:06.258593 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd91fdafd7d5898a9190f34c5c44e98e3794d3282737543c38bfefdc2a531aa2" Mar 18 19:22:06 crc kubenswrapper[5008]: I0318 19:22:06.258509 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564362-sphbx" Mar 18 19:22:06 crc kubenswrapper[5008]: I0318 19:22:06.303132 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564356-8x77l"] Mar 18 19:22:06 crc kubenswrapper[5008]: I0318 19:22:06.307945 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564356-8x77l"] Mar 18 19:22:08 crc kubenswrapper[5008]: I0318 19:22:08.215975 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c960b1d4-9bc5-476a-b009-1b468d6be24f" path="/var/lib/kubelet/pods/c960b1d4-9bc5-476a-b009-1b468d6be24f/volumes" Mar 18 19:22:24 crc kubenswrapper[5008]: I0318 19:22:24.460245 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 19:22:24 crc kubenswrapper[5008]: I0318 19:22:24.460796 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.106966 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-b4lws"] Mar 18 19:22:34 crc kubenswrapper[5008]: E0318 19:22:34.107799 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b9d9d1e-744e-4aa0-970c-c17d42363ac4" containerName="oc" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.107814 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9d9d1e-744e-4aa0-970c-c17d42363ac4" containerName="oc" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.107952 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b9d9d1e-744e-4aa0-970c-c17d42363ac4" containerName="oc" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.108673 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f4889f87-b4lws" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.114531 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-h7h9w"] Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.114763 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.115002 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.115150 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.115980 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-dqltt" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.116176 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.117115 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.121366 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-b4lws"] Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.137946 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-h7h9w"] Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.174529 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5db3c58-8cf6-4ddc-90e4-c3602f536570-config\") pod \"dnsmasq-dns-76f4889f87-b4lws\" (UID: \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\") " pod="openstack/dnsmasq-dns-76f4889f87-b4lws" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.174646 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n94p6\" (UniqueName: \"kubernetes.io/projected/ddd86f8b-3e7b-419c-ac0c-77bda06b352e-kube-api-access-n94p6\") pod \"dnsmasq-dns-78dcc4d9b5-h7h9w\" (UID: \"ddd86f8b-3e7b-419c-ac0c-77bda06b352e\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.174740 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5db3c58-8cf6-4ddc-90e4-c3602f536570-dns-svc\") pod \"dnsmasq-dns-76f4889f87-b4lws\" (UID: \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\") " pod="openstack/dnsmasq-dns-76f4889f87-b4lws" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.174763 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4kg6\" (UniqueName: \"kubernetes.io/projected/c5db3c58-8cf6-4ddc-90e4-c3602f536570-kube-api-access-p4kg6\") pod \"dnsmasq-dns-76f4889f87-b4lws\" (UID: \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\") " 
pod="openstack/dnsmasq-dns-76f4889f87-b4lws" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.174864 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd86f8b-3e7b-419c-ac0c-77bda06b352e-config\") pod \"dnsmasq-dns-78dcc4d9b5-h7h9w\" (UID: \"ddd86f8b-3e7b-419c-ac0c-77bda06b352e\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.228313 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-b4lws"] Mar 18 19:22:34 crc kubenswrapper[5008]: E0318 19:22:34.229018 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-p4kg6], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-76f4889f87-b4lws" podUID="c5db3c58-8cf6-4ddc-90e4-c3602f536570" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.251375 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cfbf56dd9-hl49w"] Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.252443 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.266865 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfbf56dd9-hl49w"] Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.275670 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5db3c58-8cf6-4ddc-90e4-c3602f536570-dns-svc\") pod \"dnsmasq-dns-76f4889f87-b4lws\" (UID: \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\") " pod="openstack/dnsmasq-dns-76f4889f87-b4lws" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.275709 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqvmd\" (UniqueName: \"kubernetes.io/projected/300542d1-fd7f-480c-91d1-36437da2a5b4-kube-api-access-wqvmd\") pod \"dnsmasq-dns-6cfbf56dd9-hl49w\" (UID: \"300542d1-fd7f-480c-91d1-36437da2a5b4\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.275734 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4kg6\" (UniqueName: \"kubernetes.io/projected/c5db3c58-8cf6-4ddc-90e4-c3602f536570-kube-api-access-p4kg6\") pod \"dnsmasq-dns-76f4889f87-b4lws\" (UID: \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\") " pod="openstack/dnsmasq-dns-76f4889f87-b4lws" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.275765 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd86f8b-3e7b-419c-ac0c-77bda06b352e-config\") pod \"dnsmasq-dns-78dcc4d9b5-h7h9w\" (UID: \"ddd86f8b-3e7b-419c-ac0c-77bda06b352e\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.275789 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/300542d1-fd7f-480c-91d1-36437da2a5b4-dns-svc\") pod \"dnsmasq-dns-6cfbf56dd9-hl49w\" (UID: \"300542d1-fd7f-480c-91d1-36437da2a5b4\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.275819 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5db3c58-8cf6-4ddc-90e4-c3602f536570-config\") pod \"dnsmasq-dns-76f4889f87-b4lws\" (UID: \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\") " pod="openstack/dnsmasq-dns-76f4889f87-b4lws" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.275898 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/300542d1-fd7f-480c-91d1-36437da2a5b4-config\") pod \"dnsmasq-dns-6cfbf56dd9-hl49w\" (UID: \"300542d1-fd7f-480c-91d1-36437da2a5b4\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.276096 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n94p6\" (UniqueName: \"kubernetes.io/projected/ddd86f8b-3e7b-419c-ac0c-77bda06b352e-kube-api-access-n94p6\") pod \"dnsmasq-dns-78dcc4d9b5-h7h9w\" (UID: \"ddd86f8b-3e7b-419c-ac0c-77bda06b352e\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.276937 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5db3c58-8cf6-4ddc-90e4-c3602f536570-dns-svc\") pod \"dnsmasq-dns-76f4889f87-b4lws\" (UID: \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\") " pod="openstack/dnsmasq-dns-76f4889f87-b4lws" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.276987 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd86f8b-3e7b-419c-ac0c-77bda06b352e-config\") pod 
\"dnsmasq-dns-78dcc4d9b5-h7h9w\" (UID: \"ddd86f8b-3e7b-419c-ac0c-77bda06b352e\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.277113 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5db3c58-8cf6-4ddc-90e4-c3602f536570-config\") pod \"dnsmasq-dns-76f4889f87-b4lws\" (UID: \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\") " pod="openstack/dnsmasq-dns-76f4889f87-b4lws" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.300509 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n94p6\" (UniqueName: \"kubernetes.io/projected/ddd86f8b-3e7b-419c-ac0c-77bda06b352e-kube-api-access-n94p6\") pod \"dnsmasq-dns-78dcc4d9b5-h7h9w\" (UID: \"ddd86f8b-3e7b-419c-ac0c-77bda06b352e\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.300529 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4kg6\" (UniqueName: \"kubernetes.io/projected/c5db3c58-8cf6-4ddc-90e4-c3602f536570-kube-api-access-p4kg6\") pod \"dnsmasq-dns-76f4889f87-b4lws\" (UID: \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\") " pod="openstack/dnsmasq-dns-76f4889f87-b4lws" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.378021 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/300542d1-fd7f-480c-91d1-36437da2a5b4-dns-svc\") pod \"dnsmasq-dns-6cfbf56dd9-hl49w\" (UID: \"300542d1-fd7f-480c-91d1-36437da2a5b4\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.378086 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/300542d1-fd7f-480c-91d1-36437da2a5b4-config\") pod \"dnsmasq-dns-6cfbf56dd9-hl49w\" (UID: \"300542d1-fd7f-480c-91d1-36437da2a5b4\") " 
pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.378152 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqvmd\" (UniqueName: \"kubernetes.io/projected/300542d1-fd7f-480c-91d1-36437da2a5b4-kube-api-access-wqvmd\") pod \"dnsmasq-dns-6cfbf56dd9-hl49w\" (UID: \"300542d1-fd7f-480c-91d1-36437da2a5b4\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.379263 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/300542d1-fd7f-480c-91d1-36437da2a5b4-dns-svc\") pod \"dnsmasq-dns-6cfbf56dd9-hl49w\" (UID: \"300542d1-fd7f-480c-91d1-36437da2a5b4\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.379578 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/300542d1-fd7f-480c-91d1-36437da2a5b4-config\") pod \"dnsmasq-dns-6cfbf56dd9-hl49w\" (UID: \"300542d1-fd7f-480c-91d1-36437da2a5b4\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.402195 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqvmd\" (UniqueName: \"kubernetes.io/projected/300542d1-fd7f-480c-91d1-36437da2a5b4-kube-api-access-wqvmd\") pod \"dnsmasq-dns-6cfbf56dd9-hl49w\" (UID: \"300542d1-fd7f-480c-91d1-36437da2a5b4\") " pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.442100 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.522179 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f4889f87-b4lws" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.543404 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfbf56dd9-hl49w"] Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.544147 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.571611 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-dgh2j"] Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.571773 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f4889f87-b4lws" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.572776 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.578721 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-dgh2j"] Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.579713 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-dns-svc\") pod \"dnsmasq-dns-7c95686bd5-dgh2j\" (UID: \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\") " pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.579864 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhd6f\" (UniqueName: \"kubernetes.io/projected/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-kube-api-access-jhd6f\") pod \"dnsmasq-dns-7c95686bd5-dgh2j\" (UID: \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\") " pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.579950 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-config\") pod \"dnsmasq-dns-7c95686bd5-dgh2j\" (UID: \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\") " pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.683211 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5db3c58-8cf6-4ddc-90e4-c3602f536570-dns-svc\") pod \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\" (UID: \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\") " Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.683583 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5db3c58-8cf6-4ddc-90e4-c3602f536570-config\") pod \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\" (UID: \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\") " Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.683650 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4kg6\" (UniqueName: \"kubernetes.io/projected/c5db3c58-8cf6-4ddc-90e4-c3602f536570-kube-api-access-p4kg6\") pod \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\" (UID: \"c5db3c58-8cf6-4ddc-90e4-c3602f536570\") " Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.683817 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhd6f\" (UniqueName: \"kubernetes.io/projected/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-kube-api-access-jhd6f\") pod \"dnsmasq-dns-7c95686bd5-dgh2j\" (UID: \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\") " pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.683870 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-config\") pod \"dnsmasq-dns-7c95686bd5-dgh2j\" (UID: \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\") " pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.683911 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-dns-svc\") pod \"dnsmasq-dns-7c95686bd5-dgh2j\" (UID: \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\") " pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.683923 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5db3c58-8cf6-4ddc-90e4-c3602f536570-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5db3c58-8cf6-4ddc-90e4-c3602f536570" (UID: "c5db3c58-8cf6-4ddc-90e4-c3602f536570"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.684030 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5db3c58-8cf6-4ddc-90e4-c3602f536570-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.685185 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-config\") pod \"dnsmasq-dns-7c95686bd5-dgh2j\" (UID: \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\") " pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.685523 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-dns-svc\") pod \"dnsmasq-dns-7c95686bd5-dgh2j\" (UID: \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\") " pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 
19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.691063 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5db3c58-8cf6-4ddc-90e4-c3602f536570-config" (OuterVolumeSpecName: "config") pod "c5db3c58-8cf6-4ddc-90e4-c3602f536570" (UID: "c5db3c58-8cf6-4ddc-90e4-c3602f536570"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.697266 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5db3c58-8cf6-4ddc-90e4-c3602f536570-kube-api-access-p4kg6" (OuterVolumeSpecName: "kube-api-access-p4kg6") pod "c5db3c58-8cf6-4ddc-90e4-c3602f536570" (UID: "c5db3c58-8cf6-4ddc-90e4-c3602f536570"). InnerVolumeSpecName "kube-api-access-p4kg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.704966 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhd6f\" (UniqueName: \"kubernetes.io/projected/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-kube-api-access-jhd6f\") pod \"dnsmasq-dns-7c95686bd5-dgh2j\" (UID: \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\") " pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.785527 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5db3c58-8cf6-4ddc-90e4-c3602f536570-config\") on node \"crc\" DevicePath \"\"" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.785580 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4kg6\" (UniqueName: \"kubernetes.io/projected/c5db3c58-8cf6-4ddc-90e4-c3602f536570-kube-api-access-p4kg6\") on node \"crc\" DevicePath \"\"" Mar 18 19:22:34 crc kubenswrapper[5008]: I0318 19:22:34.814256 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-h7h9w"] Mar 18 19:22:34 crc 
kubenswrapper[5008]: I0318 19:22:34.898888 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.079481 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfbf56dd9-hl49w"] Mar 18 19:22:35 crc kubenswrapper[5008]: W0318 19:22:35.079627 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod300542d1_fd7f_480c_91d1_36437da2a5b4.slice/crio-938612283edd0b94d1a3a919413f49fcd820b54f18eabbd3248ff69a35393bf9 WatchSource:0}: Error finding container 938612283edd0b94d1a3a919413f49fcd820b54f18eabbd3248ff69a35393bf9: Status 404 returned error can't find the container with id 938612283edd0b94d1a3a919413f49fcd820b54f18eabbd3248ff69a35393bf9 Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.106215 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-dgh2j"] Mar 18 19:22:35 crc kubenswrapper[5008]: W0318 19:22:35.111958 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb89a23_7c7a_4e77_8f39_85e59d4a9d46.slice/crio-bba7b29e3e99f56ef7cf92da7938e09a767846b57b7ebc93c405c3c7b0673043 WatchSource:0}: Error finding container bba7b29e3e99f56ef7cf92da7938e09a767846b57b7ebc93c405c3c7b0673043: Status 404 returned error can't find the container with id bba7b29e3e99f56ef7cf92da7938e09a767846b57b7ebc93c405c3c7b0673043 Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.399396 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.400454 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.402422 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.402737 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mp58x" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.402882 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.403588 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.403655 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.422000 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.537454 5008 generic.go:334] "Generic (PLEG): container finished" podID="3eb89a23-7c7a-4e77-8f39-85e59d4a9d46" containerID="188e373487b78cf1f4eabd71a3966192dda0ee206008cc2a18ba71089ef9628d" exitCode=0 Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.537795 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" event={"ID":"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46","Type":"ContainerDied","Data":"188e373487b78cf1f4eabd71a3966192dda0ee206008cc2a18ba71089ef9628d"} Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.537826 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" event={"ID":"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46","Type":"ContainerStarted","Data":"bba7b29e3e99f56ef7cf92da7938e09a767846b57b7ebc93c405c3c7b0673043"} Mar 18 19:22:35 
crc kubenswrapper[5008]: I0318 19:22:35.542000 5008 generic.go:334] "Generic (PLEG): container finished" podID="ddd86f8b-3e7b-419c-ac0c-77bda06b352e" containerID="2395167100d8c01e2cd9f2cde438af30b01c66912bc497e821e8b25f3ba38f96" exitCode=0 Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.542082 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" event={"ID":"ddd86f8b-3e7b-419c-ac0c-77bda06b352e","Type":"ContainerDied","Data":"2395167100d8c01e2cd9f2cde438af30b01c66912bc497e821e8b25f3ba38f96"} Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.542108 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" event={"ID":"ddd86f8b-3e7b-419c-ac0c-77bda06b352e","Type":"ContainerStarted","Data":"a1ca72566f9ccaa6c67114da988940db268f39760c6a30e55a402e6dda93759e"} Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.544620 5008 generic.go:334] "Generic (PLEG): container finished" podID="300542d1-fd7f-480c-91d1-36437da2a5b4" containerID="f2c9b83de1323bc714a2b234fe5addcb4153f2896d23b9d4fc5f5d693775d000" exitCode=0 Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.544690 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f4889f87-b4lws" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.544731 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" event={"ID":"300542d1-fd7f-480c-91d1-36437da2a5b4","Type":"ContainerDied","Data":"f2c9b83de1323bc714a2b234fe5addcb4153f2896d23b9d4fc5f5d693775d000"} Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.545120 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" event={"ID":"300542d1-fd7f-480c-91d1-36437da2a5b4","Type":"ContainerStarted","Data":"938612283edd0b94d1a3a919413f49fcd820b54f18eabbd3248ff69a35393bf9"} Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.596504 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ee324809-640a-4b29-871f-efb82a37bdb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee324809-640a-4b29-871f-efb82a37bdb0\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.596562 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dftd7\" (UniqueName: \"kubernetes.io/projected/f6c00d8d-362a-4cb1-8332-7507fa863003-kube-api-access-dftd7\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.596585 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6c00d8d-362a-4cb1-8332-7507fa863003-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 
19:22:35.596606 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6c00d8d-362a-4cb1-8332-7507fa863003-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.596631 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6c00d8d-362a-4cb1-8332-7507fa863003-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.596647 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.596690 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6c00d8d-362a-4cb1-8332-7507fa863003-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.596728 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: 
I0318 19:22:35.596758 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.697886 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6c00d8d-362a-4cb1-8332-7507fa863003-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.697960 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.698008 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.698063 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ee324809-640a-4b29-871f-efb82a37bdb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee324809-640a-4b29-871f-efb82a37bdb0\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.698083 
5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dftd7\" (UniqueName: \"kubernetes.io/projected/f6c00d8d-362a-4cb1-8332-7507fa863003-kube-api-access-dftd7\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.698102 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6c00d8d-362a-4cb1-8332-7507fa863003-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.698121 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6c00d8d-362a-4cb1-8332-7507fa863003-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.698142 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6c00d8d-362a-4cb1-8332-7507fa863003-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.698160 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.700416 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/f6c00d8d-362a-4cb1-8332-7507fa863003-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.700957 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.701188 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.702221 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6c00d8d-362a-4cb1-8332-7507fa863003-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.704713 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.705997 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6c00d8d-362a-4cb1-8332-7507fa863003-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.714163 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6c00d8d-362a-4cb1-8332-7507fa863003-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.714773 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.715587 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.716591 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.716628 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ee324809-640a-4b29-871f-efb82a37bdb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee324809-640a-4b29-871f-efb82a37bdb0\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2270bcf2b6cc50749fc4da73a03f1ee9c9c2e70bdb8d1c03e859f2f39eda1765/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.737154 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.737316 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.737499 5008 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-sff5b" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.737574 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.737651 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.764876 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dftd7\" (UniqueName: \"kubernetes.io/projected/f6c00d8d-362a-4cb1-8332-7507fa863003-kube-api-access-dftd7\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.800271 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ee324809-640a-4b29-871f-efb82a37bdb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee324809-640a-4b29-871f-efb82a37bdb0\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.808006 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.870880 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-b4lws"] Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.879522 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-b4lws"] Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.902687 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tddgw\" (UniqueName: \"kubernetes.io/projected/343ee353-693d-44a8-9473-805ad741f837-kube-api-access-tddgw\") pod 
\"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.902804 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/343ee353-693d-44a8-9473-805ad741f837-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.902855 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.902876 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/343ee353-693d-44a8-9473-805ad741f837-pod-info\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.902910 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.902930 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/343ee353-693d-44a8-9473-805ad741f837-server-conf\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " 
pod="openstack/rabbitmq-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.902961 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.902987 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/343ee353-693d-44a8-9473-805ad741f837-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:35 crc kubenswrapper[5008]: I0318 19:22:35.903013 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-762454e4-da82-424a-943a-c042eec94621\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-762454e4-da82-424a-943a-c042eec94621\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.004348 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tddgw\" (UniqueName: \"kubernetes.io/projected/343ee353-693d-44a8-9473-805ad741f837-kube-api-access-tddgw\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.004441 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/343ee353-693d-44a8-9473-805ad741f837-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " 
pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.004471 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.004491 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/343ee353-693d-44a8-9473-805ad741f837-pod-info\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.004526 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.004548 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/343ee353-693d-44a8-9473-805ad741f837-server-conf\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.004601 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.004630 5008 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/343ee353-693d-44a8-9473-805ad741f837-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.004662 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-762454e4-da82-424a-943a-c042eec94621\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-762454e4-da82-424a-943a-c042eec94621\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.006331 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/343ee353-693d-44a8-9473-805ad741f837-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.006650 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/343ee353-693d-44a8-9473-805ad741f837-server-conf\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.006711 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.006908 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.012247 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/343ee353-693d-44a8-9473-805ad741f837-pod-info\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.012271 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/343ee353-693d-44a8-9473-805ad741f837-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.013700 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.013739 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-762454e4-da82-424a-943a-c042eec94621\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-762454e4-da82-424a-943a-c042eec94621\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7545baf37603d6959806d2dfe7b0b7d6eb4ba700ba63c9c8a9207fd98584881d/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.018263 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.025128 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tddgw\" (UniqueName: \"kubernetes.io/projected/343ee353-693d-44a8-9473-805ad741f837-kube-api-access-tddgw\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.044373 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.050386 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-762454e4-da82-424a-943a-c042eec94621\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-762454e4-da82-424a-943a-c042eec94621\") pod \"rabbitmq-server-0\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.064905 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.100958 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.206980 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqvmd\" (UniqueName: \"kubernetes.io/projected/300542d1-fd7f-480c-91d1-36437da2a5b4-kube-api-access-wqvmd\") pod \"300542d1-fd7f-480c-91d1-36437da2a5b4\" (UID: \"300542d1-fd7f-480c-91d1-36437da2a5b4\") " Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.207051 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/300542d1-fd7f-480c-91d1-36437da2a5b4-dns-svc\") pod \"300542d1-fd7f-480c-91d1-36437da2a5b4\" (UID: \"300542d1-fd7f-480c-91d1-36437da2a5b4\") " Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.207111 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/300542d1-fd7f-480c-91d1-36437da2a5b4-config\") pod \"300542d1-fd7f-480c-91d1-36437da2a5b4\" (UID: \"300542d1-fd7f-480c-91d1-36437da2a5b4\") " Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.210049 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c5db3c58-8cf6-4ddc-90e4-c3602f536570" path="/var/lib/kubelet/pods/c5db3c58-8cf6-4ddc-90e4-c3602f536570/volumes" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.212318 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/300542d1-fd7f-480c-91d1-36437da2a5b4-kube-api-access-wqvmd" (OuterVolumeSpecName: "kube-api-access-wqvmd") pod "300542d1-fd7f-480c-91d1-36437da2a5b4" (UID: "300542d1-fd7f-480c-91d1-36437da2a5b4"). InnerVolumeSpecName "kube-api-access-wqvmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.231868 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/300542d1-fd7f-480c-91d1-36437da2a5b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "300542d1-fd7f-480c-91d1-36437da2a5b4" (UID: "300542d1-fd7f-480c-91d1-36437da2a5b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.235075 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/300542d1-fd7f-480c-91d1-36437da2a5b4-config" (OuterVolumeSpecName: "config") pod "300542d1-fd7f-480c-91d1-36437da2a5b4" (UID: "300542d1-fd7f-480c-91d1-36437da2a5b4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.308968 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqvmd\" (UniqueName: \"kubernetes.io/projected/300542d1-fd7f-480c-91d1-36437da2a5b4-kube-api-access-wqvmd\") on node \"crc\" DevicePath \"\"" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.310005 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/300542d1-fd7f-480c-91d1-36437da2a5b4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.310027 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/300542d1-fd7f-480c-91d1-36437da2a5b4-config\") on node \"crc\" DevicePath \"\"" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.500689 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 19:22:36 crc kubenswrapper[5008]: W0318 19:22:36.503740 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c00d8d_362a_4cb1_8332_7507fa863003.slice/crio-1459a1f8c19f1d04228477d108dc8841529b2877e704271eab1ca9f3a7d026f7 WatchSource:0}: Error finding container 1459a1f8c19f1d04228477d108dc8841529b2877e704271eab1ca9f3a7d026f7: Status 404 returned error can't find the container with id 1459a1f8c19f1d04228477d108dc8841529b2877e704271eab1ca9f3a7d026f7 Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.552713 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" event={"ID":"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46","Type":"ContainerStarted","Data":"f43f35a583ebc1f425ba8573f1ad653169cc84ec623103eed534ca4ff4a8d85c"} Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.552864 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.554642 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6c00d8d-362a-4cb1-8332-7507fa863003","Type":"ContainerStarted","Data":"1459a1f8c19f1d04228477d108dc8841529b2877e704271eab1ca9f3a7d026f7"} Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.557209 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" event={"ID":"ddd86f8b-3e7b-419c-ac0c-77bda06b352e","Type":"ContainerStarted","Data":"f8cb36ab5a6bee1611986c3a2e504c5cadfa4036cfe17c7ad1a3ef5049e15bd9"} Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.557240 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.558708 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" event={"ID":"300542d1-fd7f-480c-91d1-36437da2a5b4","Type":"ContainerDied","Data":"938612283edd0b94d1a3a919413f49fcd820b54f18eabbd3248ff69a35393bf9"} Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.558751 5008 scope.go:117] "RemoveContainer" containerID="f2c9b83de1323bc714a2b234fe5addcb4153f2896d23b9d4fc5f5d693775d000" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.558899 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfbf56dd9-hl49w" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.569389 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.582122 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" podStartSLOduration=2.582107117 podStartE2EDuration="2.582107117s" podCreationTimestamp="2026-03-18 19:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:22:36.576180464 +0000 UTC m=+4813.095653543" watchObservedRunningTime="2026-03-18 19:22:36.582107117 +0000 UTC m=+4813.101580186" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.597692 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" podStartSLOduration=2.597671057 podStartE2EDuration="2.597671057s" podCreationTimestamp="2026-03-18 19:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:22:36.594620958 +0000 UTC m=+4813.114094037" watchObservedRunningTime="2026-03-18 19:22:36.597671057 +0000 UTC m=+4813.117144136" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.705639 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfbf56dd9-hl49w"] Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.711319 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cfbf56dd9-hl49w"] Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.992238 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 19:22:36 crc kubenswrapper[5008]: E0318 19:22:36.997710 5008 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="300542d1-fd7f-480c-91d1-36437da2a5b4" containerName="init" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.997752 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="300542d1-fd7f-480c-91d1-36437da2a5b4" containerName="init" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.998040 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="300542d1-fd7f-480c-91d1-36437da2a5b4" containerName="init" Mar 18 19:22:36 crc kubenswrapper[5008]: I0318 19:22:36.999012 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.002113 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.002231 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-kxlnc" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.002476 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.003133 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.031890 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.062338 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.127185 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/20b29007-4fe1-4277-adf3-5ad1fe710130-config-data-default\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " 
pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.127250 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/20b29007-4fe1-4277-adf3-5ad1fe710130-kolla-config\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.127270 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b29007-4fe1-4277-adf3-5ad1fe710130-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.127438 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20b29007-4fe1-4277-adf3-5ad1fe710130-operator-scripts\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.127484 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/20b29007-4fe1-4277-adf3-5ad1fe710130-config-data-generated\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.127514 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b29007-4fe1-4277-adf3-5ad1fe710130-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 
18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.127641 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-348fc842-e966-4fa9-90ed-1848b2c35272\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-348fc842-e966-4fa9-90ed-1848b2c35272\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.127667 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk59z\" (UniqueName: \"kubernetes.io/projected/20b29007-4fe1-4277-adf3-5ad1fe710130-kube-api-access-nk59z\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.132946 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.133881 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.135325 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.135516 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-2tsck" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.146991 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.229094 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef72875b-b932-47d0-8bf9-ba4a63d47fb3-config-data\") pod \"memcached-0\" (UID: \"ef72875b-b932-47d0-8bf9-ba4a63d47fb3\") " pod="openstack/memcached-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.230071 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20b29007-4fe1-4277-adf3-5ad1fe710130-operator-scripts\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.230181 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/20b29007-4fe1-4277-adf3-5ad1fe710130-config-data-generated\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.230325 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b29007-4fe1-4277-adf3-5ad1fe710130-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.230592 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6xqs\" (UniqueName: \"kubernetes.io/projected/ef72875b-b932-47d0-8bf9-ba4a63d47fb3-kube-api-access-l6xqs\") pod \"memcached-0\" (UID: \"ef72875b-b932-47d0-8bf9-ba4a63d47fb3\") " pod="openstack/memcached-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.230727 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-348fc842-e966-4fa9-90ed-1848b2c35272\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-348fc842-e966-4fa9-90ed-1848b2c35272\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.230816 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk59z\" (UniqueName: \"kubernetes.io/projected/20b29007-4fe1-4277-adf3-5ad1fe710130-kube-api-access-nk59z\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.230919 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/20b29007-4fe1-4277-adf3-5ad1fe710130-config-data-default\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.231014 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/20b29007-4fe1-4277-adf3-5ad1fe710130-kolla-config\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " 
pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.231086 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b29007-4fe1-4277-adf3-5ad1fe710130-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.231189 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ef72875b-b932-47d0-8bf9-ba4a63d47fb3-kolla-config\") pod \"memcached-0\" (UID: \"ef72875b-b932-47d0-8bf9-ba4a63d47fb3\") " pod="openstack/memcached-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.231487 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/20b29007-4fe1-4277-adf3-5ad1fe710130-config-data-generated\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.232272 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20b29007-4fe1-4277-adf3-5ad1fe710130-operator-scripts\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.232467 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/20b29007-4fe1-4277-adf3-5ad1fe710130-kolla-config\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.232923 5008 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/20b29007-4fe1-4277-adf3-5ad1fe710130-config-data-default\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.236482 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b29007-4fe1-4277-adf3-5ad1fe710130-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.236544 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.236663 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-348fc842-e966-4fa9-90ed-1848b2c35272\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-348fc842-e966-4fa9-90ed-1848b2c35272\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fc34224cd139962ec8137510f1ba387c7fab03ba6fde5c088f37860ba24230ae/globalmount\"" pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.239875 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b29007-4fe1-4277-adf3-5ad1fe710130-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.259647 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk59z\" (UniqueName: 
\"kubernetes.io/projected/20b29007-4fe1-4277-adf3-5ad1fe710130-kube-api-access-nk59z\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.286427 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-348fc842-e966-4fa9-90ed-1848b2c35272\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-348fc842-e966-4fa9-90ed-1848b2c35272\") pod \"openstack-galera-0\" (UID: \"20b29007-4fe1-4277-adf3-5ad1fe710130\") " pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.326871 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.335382 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef72875b-b932-47d0-8bf9-ba4a63d47fb3-config-data\") pod \"memcached-0\" (UID: \"ef72875b-b932-47d0-8bf9-ba4a63d47fb3\") " pod="openstack/memcached-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.335470 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6xqs\" (UniqueName: \"kubernetes.io/projected/ef72875b-b932-47d0-8bf9-ba4a63d47fb3-kube-api-access-l6xqs\") pod \"memcached-0\" (UID: \"ef72875b-b932-47d0-8bf9-ba4a63d47fb3\") " pod="openstack/memcached-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.336803 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef72875b-b932-47d0-8bf9-ba4a63d47fb3-config-data\") pod \"memcached-0\" (UID: \"ef72875b-b932-47d0-8bf9-ba4a63d47fb3\") " pod="openstack/memcached-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.342036 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/ef72875b-b932-47d0-8bf9-ba4a63d47fb3-kolla-config\") pod \"memcached-0\" (UID: \"ef72875b-b932-47d0-8bf9-ba4a63d47fb3\") " pod="openstack/memcached-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.342952 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ef72875b-b932-47d0-8bf9-ba4a63d47fb3-kolla-config\") pod \"memcached-0\" (UID: \"ef72875b-b932-47d0-8bf9-ba4a63d47fb3\") " pod="openstack/memcached-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.450281 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6xqs\" (UniqueName: \"kubernetes.io/projected/ef72875b-b932-47d0-8bf9-ba4a63d47fb3-kube-api-access-l6xqs\") pod \"memcached-0\" (UID: \"ef72875b-b932-47d0-8bf9-ba4a63d47fb3\") " pod="openstack/memcached-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.450685 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.574289 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"343ee353-693d-44a8-9473-805ad741f837","Type":"ContainerStarted","Data":"c97a5b3f00691dcc3b4089f958911f79976efdd12829e1080a863778aa980d64"} Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.846425 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 19:22:37 crc kubenswrapper[5008]: I0318 19:22:37.918125 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.209858 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="300542d1-fd7f-480c-91d1-36437da2a5b4" path="/var/lib/kubelet/pods/300542d1-fd7f-480c-91d1-36437da2a5b4/volumes" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.307374 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.309147 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.311424 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.311690 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.311700 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-crhbz" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.319643 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.320754 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.462332 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.462371 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5zss\" (UniqueName: \"kubernetes.io/projected/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-kube-api-access-h5zss\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.462403 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.462434 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.462472 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b2857c26-7d06-4f6b-bc64-060c247fb05d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b2857c26-7d06-4f6b-bc64-060c247fb05d\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.462488 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.462513 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.462544 5008 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.564094 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.564188 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.564220 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5zss\" (UniqueName: \"kubernetes.io/projected/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-kube-api-access-h5zss\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.564250 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.564283 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.564327 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b2857c26-7d06-4f6b-bc64-060c247fb05d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b2857c26-7d06-4f6b-bc64-060c247fb05d\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.564353 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.564383 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.565335 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.566313 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.566739 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.567194 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.567228 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b2857c26-7d06-4f6b-bc64-060c247fb05d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b2857c26-7d06-4f6b-bc64-060c247fb05d\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ca6eea3112fa17a624de609f96240c5bbb17d41496d82fdfb27a5289943603b4/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.567455 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.577722 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.577733 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.588741 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5zss\" (UniqueName: \"kubernetes.io/projected/2eb4ee4c-a64a-4d36-9b6d-a3386cb30917-kube-api-access-h5zss\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.589394 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"20b29007-4fe1-4277-adf3-5ad1fe710130","Type":"ContainerStarted","Data":"945c8472baf7b285e01e6423a3184e46b685a7cd3cac7bfe016d65979066bcb0"} Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.591626 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"20b29007-4fe1-4277-adf3-5ad1fe710130","Type":"ContainerStarted","Data":"cb71b5a6b7f66984de67f401a86ba016a1692b03037b885ec27bc006d72b110f"} Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.591774 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6c00d8d-362a-4cb1-8332-7507fa863003","Type":"ContainerStarted","Data":"cef769c0644db0d2f2ca0b34e22e5338120aaf2c5e915a5948d61ea7c2098d7b"} Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.591928 5008 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/rabbitmq-server-0" event={"ID":"343ee353-693d-44a8-9473-805ad741f837","Type":"ContainerStarted","Data":"20e88dd6639973a378d733dbd06150698bfb01dacf6f820c3b4336077dc66fef"} Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.593485 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ef72875b-b932-47d0-8bf9-ba4a63d47fb3","Type":"ContainerStarted","Data":"76f4532315fa3bc0d31fb17a038e6d13887b2d1d276f78ce41722406e1e03d0c"} Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.593526 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ef72875b-b932-47d0-8bf9-ba4a63d47fb3","Type":"ContainerStarted","Data":"59c90b04446596926728eb23921ac3d5270a0e15859babd86d3785a2c497cdb8"} Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.594161 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.621791 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b2857c26-7d06-4f6b-bc64-060c247fb05d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b2857c26-7d06-4f6b-bc64-060c247fb05d\") pod \"openstack-cell1-galera-0\" (UID: \"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917\") " pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.695134 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.69510776 podStartE2EDuration="1.69510776s" podCreationTimestamp="2026-03-18 19:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:22:38.688368297 +0000 UTC m=+4815.207841386" watchObservedRunningTime="2026-03-18 19:22:38.69510776 +0000 UTC m=+4815.214580839" Mar 18 19:22:38 crc kubenswrapper[5008]: I0318 19:22:38.697907 5008 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:39 crc kubenswrapper[5008]: I0318 19:22:39.144953 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 19:22:39 crc kubenswrapper[5008]: I0318 19:22:39.603641 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917","Type":"ContainerStarted","Data":"927d5df98dfe57dc5fb2120e8da9813ef271f4421aa4c6eb4d0cf7b428e6e1ba"} Mar 18 19:22:39 crc kubenswrapper[5008]: I0318 19:22:39.603720 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917","Type":"ContainerStarted","Data":"1eb5b60abddca06855fd05b8aad6df183209e81ce4a59bccd31f8951c1f7851e"} Mar 18 19:22:42 crc kubenswrapper[5008]: I0318 19:22:42.627532 5008 generic.go:334] "Generic (PLEG): container finished" podID="20b29007-4fe1-4277-adf3-5ad1fe710130" containerID="945c8472baf7b285e01e6423a3184e46b685a7cd3cac7bfe016d65979066bcb0" exitCode=0 Mar 18 19:22:42 crc kubenswrapper[5008]: I0318 19:22:42.627635 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"20b29007-4fe1-4277-adf3-5ad1fe710130","Type":"ContainerDied","Data":"945c8472baf7b285e01e6423a3184e46b685a7cd3cac7bfe016d65979066bcb0"} Mar 18 19:22:43 crc kubenswrapper[5008]: I0318 19:22:43.641877 5008 generic.go:334] "Generic (PLEG): container finished" podID="2eb4ee4c-a64a-4d36-9b6d-a3386cb30917" containerID="927d5df98dfe57dc5fb2120e8da9813ef271f4421aa4c6eb4d0cf7b428e6e1ba" exitCode=0 Mar 18 19:22:43 crc kubenswrapper[5008]: I0318 19:22:43.642101 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917","Type":"ContainerDied","Data":"927d5df98dfe57dc5fb2120e8da9813ef271f4421aa4c6eb4d0cf7b428e6e1ba"} 
Mar 18 19:22:43 crc kubenswrapper[5008]: I0318 19:22:43.646250 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"20b29007-4fe1-4277-adf3-5ad1fe710130","Type":"ContainerStarted","Data":"89f8caa770b5ad5a04b32e85accbde01be1803c798bfd6f210441a6f975860d0"} Mar 18 19:22:43 crc kubenswrapper[5008]: I0318 19:22:43.711147 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.71112593 podStartE2EDuration="8.71112593s" podCreationTimestamp="2026-03-18 19:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:22:43.699997164 +0000 UTC m=+4820.219470273" watchObservedRunningTime="2026-03-18 19:22:43.71112593 +0000 UTC m=+4820.230599029" Mar 18 19:22:44 crc kubenswrapper[5008]: I0318 19:22:44.217661 5008 scope.go:117] "RemoveContainer" containerID="bf823b919175eb6cceede839b695ac5f814f6a4b7702733020481c9ed01b8495" Mar 18 19:22:44 crc kubenswrapper[5008]: I0318 19:22:44.443921 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" Mar 18 19:22:44 crc kubenswrapper[5008]: I0318 19:22:44.654969 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2eb4ee4c-a64a-4d36-9b6d-a3386cb30917","Type":"ContainerStarted","Data":"d1a4d3e8fedf5b3ed5bdbfd82d71a78bb1439c7e8a9890220617f99a999c2936"} Mar 18 19:22:44 crc kubenswrapper[5008]: I0318 19:22:44.901152 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:22:44 crc kubenswrapper[5008]: I0318 19:22:44.933451 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.933417837 podStartE2EDuration="7.933417837s" podCreationTimestamp="2026-03-18 
19:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:22:44.673930713 +0000 UTC m=+4821.193403792" watchObservedRunningTime="2026-03-18 19:22:44.933417837 +0000 UTC m=+4821.452890956" Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.013347 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-h7h9w"] Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.013547 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" podUID="ddd86f8b-3e7b-419c-ac0c-77bda06b352e" containerName="dnsmasq-dns" containerID="cri-o://f8cb36ab5a6bee1611986c3a2e504c5cadfa4036cfe17c7ad1a3ef5049e15bd9" gracePeriod=10 Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.433032 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.589168 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n94p6\" (UniqueName: \"kubernetes.io/projected/ddd86f8b-3e7b-419c-ac0c-77bda06b352e-kube-api-access-n94p6\") pod \"ddd86f8b-3e7b-419c-ac0c-77bda06b352e\" (UID: \"ddd86f8b-3e7b-419c-ac0c-77bda06b352e\") " Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.589318 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd86f8b-3e7b-419c-ac0c-77bda06b352e-config\") pod \"ddd86f8b-3e7b-419c-ac0c-77bda06b352e\" (UID: \"ddd86f8b-3e7b-419c-ac0c-77bda06b352e\") " Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.594324 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd86f8b-3e7b-419c-ac0c-77bda06b352e-kube-api-access-n94p6" (OuterVolumeSpecName: "kube-api-access-n94p6") pod 
"ddd86f8b-3e7b-419c-ac0c-77bda06b352e" (UID: "ddd86f8b-3e7b-419c-ac0c-77bda06b352e"). InnerVolumeSpecName "kube-api-access-n94p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.632097 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd86f8b-3e7b-419c-ac0c-77bda06b352e-config" (OuterVolumeSpecName: "config") pod "ddd86f8b-3e7b-419c-ac0c-77bda06b352e" (UID: "ddd86f8b-3e7b-419c-ac0c-77bda06b352e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.662483 5008 generic.go:334] "Generic (PLEG): container finished" podID="ddd86f8b-3e7b-419c-ac0c-77bda06b352e" containerID="f8cb36ab5a6bee1611986c3a2e504c5cadfa4036cfe17c7ad1a3ef5049e15bd9" exitCode=0 Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.662532 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" event={"ID":"ddd86f8b-3e7b-419c-ac0c-77bda06b352e","Type":"ContainerDied","Data":"f8cb36ab5a6bee1611986c3a2e504c5cadfa4036cfe17c7ad1a3ef5049e15bd9"} Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.662536 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.662576 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dcc4d9b5-h7h9w" event={"ID":"ddd86f8b-3e7b-419c-ac0c-77bda06b352e","Type":"ContainerDied","Data":"a1ca72566f9ccaa6c67114da988940db268f39760c6a30e55a402e6dda93759e"} Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.662594 5008 scope.go:117] "RemoveContainer" containerID="f8cb36ab5a6bee1611986c3a2e504c5cadfa4036cfe17c7ad1a3ef5049e15bd9" Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.678744 5008 scope.go:117] "RemoveContainer" containerID="2395167100d8c01e2cd9f2cde438af30b01c66912bc497e821e8b25f3ba38f96" Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.690632 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n94p6\" (UniqueName: \"kubernetes.io/projected/ddd86f8b-3e7b-419c-ac0c-77bda06b352e-kube-api-access-n94p6\") on node \"crc\" DevicePath \"\"" Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.690657 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd86f8b-3e7b-419c-ac0c-77bda06b352e-config\") on node \"crc\" DevicePath \"\"" Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.690675 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-h7h9w"] Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.695534 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-h7h9w"] Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.714590 5008 scope.go:117] "RemoveContainer" containerID="f8cb36ab5a6bee1611986c3a2e504c5cadfa4036cfe17c7ad1a3ef5049e15bd9" Mar 18 19:22:45 crc kubenswrapper[5008]: E0318 19:22:45.715041 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f8cb36ab5a6bee1611986c3a2e504c5cadfa4036cfe17c7ad1a3ef5049e15bd9\": container with ID starting with f8cb36ab5a6bee1611986c3a2e504c5cadfa4036cfe17c7ad1a3ef5049e15bd9 not found: ID does not exist" containerID="f8cb36ab5a6bee1611986c3a2e504c5cadfa4036cfe17c7ad1a3ef5049e15bd9" Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.715087 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8cb36ab5a6bee1611986c3a2e504c5cadfa4036cfe17c7ad1a3ef5049e15bd9"} err="failed to get container status \"f8cb36ab5a6bee1611986c3a2e504c5cadfa4036cfe17c7ad1a3ef5049e15bd9\": rpc error: code = NotFound desc = could not find container \"f8cb36ab5a6bee1611986c3a2e504c5cadfa4036cfe17c7ad1a3ef5049e15bd9\": container with ID starting with f8cb36ab5a6bee1611986c3a2e504c5cadfa4036cfe17c7ad1a3ef5049e15bd9 not found: ID does not exist" Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.715114 5008 scope.go:117] "RemoveContainer" containerID="2395167100d8c01e2cd9f2cde438af30b01c66912bc497e821e8b25f3ba38f96" Mar 18 19:22:45 crc kubenswrapper[5008]: E0318 19:22:45.715514 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2395167100d8c01e2cd9f2cde438af30b01c66912bc497e821e8b25f3ba38f96\": container with ID starting with 2395167100d8c01e2cd9f2cde438af30b01c66912bc497e821e8b25f3ba38f96 not found: ID does not exist" containerID="2395167100d8c01e2cd9f2cde438af30b01c66912bc497e821e8b25f3ba38f96" Mar 18 19:22:45 crc kubenswrapper[5008]: I0318 19:22:45.715572 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2395167100d8c01e2cd9f2cde438af30b01c66912bc497e821e8b25f3ba38f96"} err="failed to get container status \"2395167100d8c01e2cd9f2cde438af30b01c66912bc497e821e8b25f3ba38f96\": rpc error: code = NotFound desc = could not find container \"2395167100d8c01e2cd9f2cde438af30b01c66912bc497e821e8b25f3ba38f96\": container with ID 
starting with 2395167100d8c01e2cd9f2cde438af30b01c66912bc497e821e8b25f3ba38f96 not found: ID does not exist" Mar 18 19:22:46 crc kubenswrapper[5008]: I0318 19:22:46.213767 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd86f8b-3e7b-419c-ac0c-77bda06b352e" path="/var/lib/kubelet/pods/ddd86f8b-3e7b-419c-ac0c-77bda06b352e/volumes" Mar 18 19:22:47 crc kubenswrapper[5008]: I0318 19:22:47.327814 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 19:22:47 crc kubenswrapper[5008]: I0318 19:22:47.327940 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 19:22:47 crc kubenswrapper[5008]: I0318 19:22:47.401037 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 19:22:47 crc kubenswrapper[5008]: I0318 19:22:47.453839 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 19:22:47 crc kubenswrapper[5008]: I0318 19:22:47.753745 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 19:22:48 crc kubenswrapper[5008]: E0318 19:22:48.000635 5008 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.9:47422->38.102.83.9:40185: read tcp 38.102.83.9:47422->38.102.83.9:40185: read: connection reset by peer Mar 18 19:22:48 crc kubenswrapper[5008]: I0318 19:22:48.698709 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:48 crc kubenswrapper[5008]: I0318 19:22:48.698763 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:48 crc kubenswrapper[5008]: I0318 19:22:48.798413 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:49 crc kubenswrapper[5008]: I0318 19:22:49.769959 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 19:22:54 crc kubenswrapper[5008]: I0318 19:22:54.460419 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:22:54 crc kubenswrapper[5008]: I0318 19:22:54.461286 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:22:54 crc kubenswrapper[5008]: I0318 19:22:54.461373 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 19:22:54 crc kubenswrapper[5008]: I0318 19:22:54.462450 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d595231fbacf4e90ffb123dbddc3f5bb05b324fc9dae73ed4f109d00d75ea52"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 19:22:54 crc kubenswrapper[5008]: I0318 19:22:54.462605 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://4d595231fbacf4e90ffb123dbddc3f5bb05b324fc9dae73ed4f109d00d75ea52" gracePeriod=600 
Mar 18 19:22:54 crc kubenswrapper[5008]: I0318 19:22:54.740983 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="4d595231fbacf4e90ffb123dbddc3f5bb05b324fc9dae73ed4f109d00d75ea52" exitCode=0 Mar 18 19:22:54 crc kubenswrapper[5008]: I0318 19:22:54.741027 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"4d595231fbacf4e90ffb123dbddc3f5bb05b324fc9dae73ed4f109d00d75ea52"} Mar 18 19:22:54 crc kubenswrapper[5008]: I0318 19:22:54.741454 5008 scope.go:117] "RemoveContainer" containerID="5807b1504501c92d9a80634bc9844861d31b87c30d339b3095a08d7deee22b86" Mar 18 19:22:55 crc kubenswrapper[5008]: I0318 19:22:55.751008 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7"} Mar 18 19:22:55 crc kubenswrapper[5008]: I0318 19:22:55.820636 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-tv6cx"] Mar 18 19:22:55 crc kubenswrapper[5008]: E0318 19:22:55.821055 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd86f8b-3e7b-419c-ac0c-77bda06b352e" containerName="dnsmasq-dns" Mar 18 19:22:55 crc kubenswrapper[5008]: I0318 19:22:55.821077 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd86f8b-3e7b-419c-ac0c-77bda06b352e" containerName="dnsmasq-dns" Mar 18 19:22:55 crc kubenswrapper[5008]: E0318 19:22:55.821102 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd86f8b-3e7b-419c-ac0c-77bda06b352e" containerName="init" Mar 18 19:22:55 crc kubenswrapper[5008]: I0318 19:22:55.821111 5008 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ddd86f8b-3e7b-419c-ac0c-77bda06b352e" containerName="init" Mar 18 19:22:55 crc kubenswrapper[5008]: I0318 19:22:55.821308 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd86f8b-3e7b-419c-ac0c-77bda06b352e" containerName="dnsmasq-dns" Mar 18 19:22:55 crc kubenswrapper[5008]: I0318 19:22:55.821947 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tv6cx" Mar 18 19:22:55 crc kubenswrapper[5008]: I0318 19:22:55.825980 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 19:22:55 crc kubenswrapper[5008]: I0318 19:22:55.828713 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tv6cx"] Mar 18 19:22:55 crc kubenswrapper[5008]: I0318 19:22:55.892077 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f7fa759-75bd-4bb0-9c69-ad9fe48382d2-operator-scripts\") pod \"root-account-create-update-tv6cx\" (UID: \"2f7fa759-75bd-4bb0-9c69-ad9fe48382d2\") " pod="openstack/root-account-create-update-tv6cx" Mar 18 19:22:55 crc kubenswrapper[5008]: I0318 19:22:55.893111 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6tjz\" (UniqueName: \"kubernetes.io/projected/2f7fa759-75bd-4bb0-9c69-ad9fe48382d2-kube-api-access-b6tjz\") pod \"root-account-create-update-tv6cx\" (UID: \"2f7fa759-75bd-4bb0-9c69-ad9fe48382d2\") " pod="openstack/root-account-create-update-tv6cx" Mar 18 19:22:55 crc kubenswrapper[5008]: I0318 19:22:55.994705 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6tjz\" (UniqueName: \"kubernetes.io/projected/2f7fa759-75bd-4bb0-9c69-ad9fe48382d2-kube-api-access-b6tjz\") pod \"root-account-create-update-tv6cx\" (UID: 
\"2f7fa759-75bd-4bb0-9c69-ad9fe48382d2\") " pod="openstack/root-account-create-update-tv6cx" Mar 18 19:22:55 crc kubenswrapper[5008]: I0318 19:22:55.994873 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f7fa759-75bd-4bb0-9c69-ad9fe48382d2-operator-scripts\") pod \"root-account-create-update-tv6cx\" (UID: \"2f7fa759-75bd-4bb0-9c69-ad9fe48382d2\") " pod="openstack/root-account-create-update-tv6cx" Mar 18 19:22:55 crc kubenswrapper[5008]: I0318 19:22:55.995601 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f7fa759-75bd-4bb0-9c69-ad9fe48382d2-operator-scripts\") pod \"root-account-create-update-tv6cx\" (UID: \"2f7fa759-75bd-4bb0-9c69-ad9fe48382d2\") " pod="openstack/root-account-create-update-tv6cx" Mar 18 19:22:56 crc kubenswrapper[5008]: I0318 19:22:56.020123 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6tjz\" (UniqueName: \"kubernetes.io/projected/2f7fa759-75bd-4bb0-9c69-ad9fe48382d2-kube-api-access-b6tjz\") pod \"root-account-create-update-tv6cx\" (UID: \"2f7fa759-75bd-4bb0-9c69-ad9fe48382d2\") " pod="openstack/root-account-create-update-tv6cx" Mar 18 19:22:56 crc kubenswrapper[5008]: I0318 19:22:56.143108 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tv6cx" Mar 18 19:22:56 crc kubenswrapper[5008]: I0318 19:22:56.625248 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tv6cx"] Mar 18 19:22:56 crc kubenswrapper[5008]: I0318 19:22:56.757979 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tv6cx" event={"ID":"2f7fa759-75bd-4bb0-9c69-ad9fe48382d2","Type":"ContainerStarted","Data":"57d54f80248bed56fd47e62fa8f14f0d7efb15bc880b81b2596f3130c8414ba7"} Mar 18 19:22:57 crc kubenswrapper[5008]: I0318 19:22:57.767336 5008 generic.go:334] "Generic (PLEG): container finished" podID="2f7fa759-75bd-4bb0-9c69-ad9fe48382d2" containerID="2a2ddbff83bf852ae8f892d67c0a07b59310a63c97ac11a8397fab68c1ad68e7" exitCode=0 Mar 18 19:22:57 crc kubenswrapper[5008]: I0318 19:22:57.767407 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tv6cx" event={"ID":"2f7fa759-75bd-4bb0-9c69-ad9fe48382d2","Type":"ContainerDied","Data":"2a2ddbff83bf852ae8f892d67c0a07b59310a63c97ac11a8397fab68c1ad68e7"} Mar 18 19:22:59 crc kubenswrapper[5008]: I0318 19:22:59.154623 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tv6cx" Mar 18 19:22:59 crc kubenswrapper[5008]: I0318 19:22:59.247593 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f7fa759-75bd-4bb0-9c69-ad9fe48382d2-operator-scripts\") pod \"2f7fa759-75bd-4bb0-9c69-ad9fe48382d2\" (UID: \"2f7fa759-75bd-4bb0-9c69-ad9fe48382d2\") " Mar 18 19:22:59 crc kubenswrapper[5008]: I0318 19:22:59.247745 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6tjz\" (UniqueName: \"kubernetes.io/projected/2f7fa759-75bd-4bb0-9c69-ad9fe48382d2-kube-api-access-b6tjz\") pod \"2f7fa759-75bd-4bb0-9c69-ad9fe48382d2\" (UID: \"2f7fa759-75bd-4bb0-9c69-ad9fe48382d2\") " Mar 18 19:22:59 crc kubenswrapper[5008]: I0318 19:22:59.248364 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f7fa759-75bd-4bb0-9c69-ad9fe48382d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f7fa759-75bd-4bb0-9c69-ad9fe48382d2" (UID: "2f7fa759-75bd-4bb0-9c69-ad9fe48382d2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:22:59 crc kubenswrapper[5008]: I0318 19:22:59.255754 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f7fa759-75bd-4bb0-9c69-ad9fe48382d2-kube-api-access-b6tjz" (OuterVolumeSpecName: "kube-api-access-b6tjz") pod "2f7fa759-75bd-4bb0-9c69-ad9fe48382d2" (UID: "2f7fa759-75bd-4bb0-9c69-ad9fe48382d2"). InnerVolumeSpecName "kube-api-access-b6tjz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:22:59 crc kubenswrapper[5008]: I0318 19:22:59.349746 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f7fa759-75bd-4bb0-9c69-ad9fe48382d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 19:22:59 crc kubenswrapper[5008]: I0318 19:22:59.349799 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6tjz\" (UniqueName: \"kubernetes.io/projected/2f7fa759-75bd-4bb0-9c69-ad9fe48382d2-kube-api-access-b6tjz\") on node \"crc\" DevicePath \"\"" Mar 18 19:22:59 crc kubenswrapper[5008]: I0318 19:22:59.792634 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tv6cx" event={"ID":"2f7fa759-75bd-4bb0-9c69-ad9fe48382d2","Type":"ContainerDied","Data":"57d54f80248bed56fd47e62fa8f14f0d7efb15bc880b81b2596f3130c8414ba7"} Mar 18 19:22:59 crc kubenswrapper[5008]: I0318 19:22:59.792697 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57d54f80248bed56fd47e62fa8f14f0d7efb15bc880b81b2596f3130c8414ba7" Mar 18 19:22:59 crc kubenswrapper[5008]: I0318 19:22:59.792736 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tv6cx" Mar 18 19:23:02 crc kubenswrapper[5008]: I0318 19:23:02.298695 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-tv6cx"] Mar 18 19:23:02 crc kubenswrapper[5008]: I0318 19:23:02.305338 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-tv6cx"] Mar 18 19:23:04 crc kubenswrapper[5008]: I0318 19:23:04.209017 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f7fa759-75bd-4bb0-9c69-ad9fe48382d2" path="/var/lib/kubelet/pods/2f7fa759-75bd-4bb0-9c69-ad9fe48382d2/volumes" Mar 18 19:23:07 crc kubenswrapper[5008]: I0318 19:23:07.326155 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-q986z"] Mar 18 19:23:07 crc kubenswrapper[5008]: E0318 19:23:07.326724 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f7fa759-75bd-4bb0-9c69-ad9fe48382d2" containerName="mariadb-account-create-update" Mar 18 19:23:07 crc kubenswrapper[5008]: I0318 19:23:07.326767 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7fa759-75bd-4bb0-9c69-ad9fe48382d2" containerName="mariadb-account-create-update" Mar 18 19:23:07 crc kubenswrapper[5008]: I0318 19:23:07.327127 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f7fa759-75bd-4bb0-9c69-ad9fe48382d2" containerName="mariadb-account-create-update" Mar 18 19:23:07 crc kubenswrapper[5008]: I0318 19:23:07.328214 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-q986z" Mar 18 19:23:07 crc kubenswrapper[5008]: I0318 19:23:07.332654 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 19:23:07 crc kubenswrapper[5008]: I0318 19:23:07.341161 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-q986z"] Mar 18 19:23:07 crc kubenswrapper[5008]: I0318 19:23:07.420080 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35246fdb-28bc-4e43-af93-b4171a374147-operator-scripts\") pod \"root-account-create-update-q986z\" (UID: \"35246fdb-28bc-4e43-af93-b4171a374147\") " pod="openstack/root-account-create-update-q986z" Mar 18 19:23:07 crc kubenswrapper[5008]: I0318 19:23:07.420488 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj566\" (UniqueName: \"kubernetes.io/projected/35246fdb-28bc-4e43-af93-b4171a374147-kube-api-access-xj566\") pod \"root-account-create-update-q986z\" (UID: \"35246fdb-28bc-4e43-af93-b4171a374147\") " pod="openstack/root-account-create-update-q986z" Mar 18 19:23:07 crc kubenswrapper[5008]: I0318 19:23:07.521338 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj566\" (UniqueName: \"kubernetes.io/projected/35246fdb-28bc-4e43-af93-b4171a374147-kube-api-access-xj566\") pod \"root-account-create-update-q986z\" (UID: \"35246fdb-28bc-4e43-af93-b4171a374147\") " pod="openstack/root-account-create-update-q986z" Mar 18 19:23:07 crc kubenswrapper[5008]: I0318 19:23:07.521419 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35246fdb-28bc-4e43-af93-b4171a374147-operator-scripts\") pod \"root-account-create-update-q986z\" (UID: 
\"35246fdb-28bc-4e43-af93-b4171a374147\") " pod="openstack/root-account-create-update-q986z" Mar 18 19:23:07 crc kubenswrapper[5008]: I0318 19:23:07.522577 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35246fdb-28bc-4e43-af93-b4171a374147-operator-scripts\") pod \"root-account-create-update-q986z\" (UID: \"35246fdb-28bc-4e43-af93-b4171a374147\") " pod="openstack/root-account-create-update-q986z" Mar 18 19:23:07 crc kubenswrapper[5008]: I0318 19:23:07.544980 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj566\" (UniqueName: \"kubernetes.io/projected/35246fdb-28bc-4e43-af93-b4171a374147-kube-api-access-xj566\") pod \"root-account-create-update-q986z\" (UID: \"35246fdb-28bc-4e43-af93-b4171a374147\") " pod="openstack/root-account-create-update-q986z" Mar 18 19:23:07 crc kubenswrapper[5008]: I0318 19:23:07.666813 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-q986z" Mar 18 19:23:08 crc kubenswrapper[5008]: I0318 19:23:08.157204 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-q986z"] Mar 18 19:23:08 crc kubenswrapper[5008]: W0318 19:23:08.171143 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35246fdb_28bc_4e43_af93_b4171a374147.slice/crio-956da824240950d47112330bfe73f724f592e7ad0d6ead43651a21e94ed10e09 WatchSource:0}: Error finding container 956da824240950d47112330bfe73f724f592e7ad0d6ead43651a21e94ed10e09: Status 404 returned error can't find the container with id 956da824240950d47112330bfe73f724f592e7ad0d6ead43651a21e94ed10e09 Mar 18 19:23:08 crc kubenswrapper[5008]: I0318 19:23:08.873794 5008 generic.go:334] "Generic (PLEG): container finished" podID="35246fdb-28bc-4e43-af93-b4171a374147" containerID="3a4dc2f4dea0d98b56e5a3ba1287d14dd64b851dcbc4f5cf708b04cf12ff3c4c" exitCode=0 Mar 18 19:23:08 crc kubenswrapper[5008]: I0318 19:23:08.873902 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q986z" event={"ID":"35246fdb-28bc-4e43-af93-b4171a374147","Type":"ContainerDied","Data":"3a4dc2f4dea0d98b56e5a3ba1287d14dd64b851dcbc4f5cf708b04cf12ff3c4c"} Mar 18 19:23:08 crc kubenswrapper[5008]: I0318 19:23:08.874132 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q986z" event={"ID":"35246fdb-28bc-4e43-af93-b4171a374147","Type":"ContainerStarted","Data":"956da824240950d47112330bfe73f724f592e7ad0d6ead43651a21e94ed10e09"} Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.232585 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-q986z" Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.364340 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj566\" (UniqueName: \"kubernetes.io/projected/35246fdb-28bc-4e43-af93-b4171a374147-kube-api-access-xj566\") pod \"35246fdb-28bc-4e43-af93-b4171a374147\" (UID: \"35246fdb-28bc-4e43-af93-b4171a374147\") " Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.364458 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35246fdb-28bc-4e43-af93-b4171a374147-operator-scripts\") pod \"35246fdb-28bc-4e43-af93-b4171a374147\" (UID: \"35246fdb-28bc-4e43-af93-b4171a374147\") " Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.365485 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35246fdb-28bc-4e43-af93-b4171a374147-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35246fdb-28bc-4e43-af93-b4171a374147" (UID: "35246fdb-28bc-4e43-af93-b4171a374147"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.373883 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35246fdb-28bc-4e43-af93-b4171a374147-kube-api-access-xj566" (OuterVolumeSpecName: "kube-api-access-xj566") pod "35246fdb-28bc-4e43-af93-b4171a374147" (UID: "35246fdb-28bc-4e43-af93-b4171a374147"). InnerVolumeSpecName "kube-api-access-xj566". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.466670 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj566\" (UniqueName: \"kubernetes.io/projected/35246fdb-28bc-4e43-af93-b4171a374147-kube-api-access-xj566\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.466922 5008 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35246fdb-28bc-4e43-af93-b4171a374147-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.893265 5008 generic.go:334] "Generic (PLEG): container finished" podID="f6c00d8d-362a-4cb1-8332-7507fa863003" containerID="cef769c0644db0d2f2ca0b34e22e5338120aaf2c5e915a5948d61ea7c2098d7b" exitCode=0 Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.893352 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6c00d8d-362a-4cb1-8332-7507fa863003","Type":"ContainerDied","Data":"cef769c0644db0d2f2ca0b34e22e5338120aaf2c5e915a5948d61ea7c2098d7b"} Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.895997 5008 generic.go:334] "Generic (PLEG): container finished" podID="343ee353-693d-44a8-9473-805ad741f837" containerID="20e88dd6639973a378d733dbd06150698bfb01dacf6f820c3b4336077dc66fef" exitCode=0 Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.896156 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"343ee353-693d-44a8-9473-805ad741f837","Type":"ContainerDied","Data":"20e88dd6639973a378d733dbd06150698bfb01dacf6f820c3b4336077dc66fef"} Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.900774 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q986z" 
event={"ID":"35246fdb-28bc-4e43-af93-b4171a374147","Type":"ContainerDied","Data":"956da824240950d47112330bfe73f724f592e7ad0d6ead43651a21e94ed10e09"} Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.900830 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="956da824240950d47112330bfe73f724f592e7ad0d6ead43651a21e94ed10e09" Mar 18 19:23:10 crc kubenswrapper[5008]: I0318 19:23:10.900897 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q986z" Mar 18 19:23:11 crc kubenswrapper[5008]: I0318 19:23:11.909270 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"343ee353-693d-44a8-9473-805ad741f837","Type":"ContainerStarted","Data":"c35fc69a569ca83654762edd7ad8737758df62058cb9500f9b7bd75fa8ca93b4"} Mar 18 19:23:11 crc kubenswrapper[5008]: I0318 19:23:11.909939 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 19:23:11 crc kubenswrapper[5008]: I0318 19:23:11.911078 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6c00d8d-362a-4cb1-8332-7507fa863003","Type":"ContainerStarted","Data":"a19c446fa241e94f1615222bf3d5fca8c7643db4cd5a8957e2a102a8a73b03fa"} Mar 18 19:23:11 crc kubenswrapper[5008]: I0318 19:23:11.911244 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:11 crc kubenswrapper[5008]: I0318 19:23:11.938345 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.938327646 podStartE2EDuration="37.938327646s" podCreationTimestamp="2026-03-18 19:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:23:11.938119751 +0000 UTC m=+4848.457592860" 
watchObservedRunningTime="2026-03-18 19:23:11.938327646 +0000 UTC m=+4848.457800725" Mar 18 19:23:11 crc kubenswrapper[5008]: I0318 19:23:11.980232 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.980209863 podStartE2EDuration="37.980209863s" podCreationTimestamp="2026-03-18 19:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:23:11.973231124 +0000 UTC m=+4848.492704213" watchObservedRunningTime="2026-03-18 19:23:11.980209863 +0000 UTC m=+4848.499682942" Mar 18 19:23:26 crc kubenswrapper[5008]: I0318 19:23:26.069697 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:26 crc kubenswrapper[5008]: I0318 19:23:26.107160 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.578035 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-v8j2b"] Mar 18 19:23:32 crc kubenswrapper[5008]: E0318 19:23:32.578764 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35246fdb-28bc-4e43-af93-b4171a374147" containerName="mariadb-account-create-update" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.578777 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="35246fdb-28bc-4e43-af93-b4171a374147" containerName="mariadb-account-create-update" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.578917 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="35246fdb-28bc-4e43-af93-b4171a374147" containerName="mariadb-account-create-update" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.579709 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.595406 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-v8j2b"] Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.726527 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pqgn\" (UniqueName: \"kubernetes.io/projected/190656d4-52a0-4b74-9472-d74daeb2d2be-kube-api-access-4pqgn\") pod \"dnsmasq-dns-684c864bc9-v8j2b\" (UID: \"190656d4-52a0-4b74-9472-d74daeb2d2be\") " pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.726900 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/190656d4-52a0-4b74-9472-d74daeb2d2be-config\") pod \"dnsmasq-dns-684c864bc9-v8j2b\" (UID: \"190656d4-52a0-4b74-9472-d74daeb2d2be\") " pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.726994 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/190656d4-52a0-4b74-9472-d74daeb2d2be-dns-svc\") pod \"dnsmasq-dns-684c864bc9-v8j2b\" (UID: \"190656d4-52a0-4b74-9472-d74daeb2d2be\") " pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.828930 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pqgn\" (UniqueName: \"kubernetes.io/projected/190656d4-52a0-4b74-9472-d74daeb2d2be-kube-api-access-4pqgn\") pod \"dnsmasq-dns-684c864bc9-v8j2b\" (UID: \"190656d4-52a0-4b74-9472-d74daeb2d2be\") " pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.829355 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/190656d4-52a0-4b74-9472-d74daeb2d2be-config\") pod \"dnsmasq-dns-684c864bc9-v8j2b\" (UID: \"190656d4-52a0-4b74-9472-d74daeb2d2be\") " pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.829654 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/190656d4-52a0-4b74-9472-d74daeb2d2be-dns-svc\") pod \"dnsmasq-dns-684c864bc9-v8j2b\" (UID: \"190656d4-52a0-4b74-9472-d74daeb2d2be\") " pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.830573 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/190656d4-52a0-4b74-9472-d74daeb2d2be-dns-svc\") pod \"dnsmasq-dns-684c864bc9-v8j2b\" (UID: \"190656d4-52a0-4b74-9472-d74daeb2d2be\") " pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.831253 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/190656d4-52a0-4b74-9472-d74daeb2d2be-config\") pod \"dnsmasq-dns-684c864bc9-v8j2b\" (UID: \"190656d4-52a0-4b74-9472-d74daeb2d2be\") " pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.863711 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pqgn\" (UniqueName: \"kubernetes.io/projected/190656d4-52a0-4b74-9472-d74daeb2d2be-kube-api-access-4pqgn\") pod \"dnsmasq-dns-684c864bc9-v8j2b\" (UID: \"190656d4-52a0-4b74-9472-d74daeb2d2be\") " pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:23:32 crc kubenswrapper[5008]: I0318 19:23:32.897456 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:23:33 crc kubenswrapper[5008]: I0318 19:23:33.139783 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-v8j2b"] Mar 18 19:23:33 crc kubenswrapper[5008]: W0318 19:23:33.149035 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod190656d4_52a0_4b74_9472_d74daeb2d2be.slice/crio-e077ce6e1f6a6e08b776407c49e8e938752d354b6c74f0d6af105e56c1ef013f WatchSource:0}: Error finding container e077ce6e1f6a6e08b776407c49e8e938752d354b6c74f0d6af105e56c1ef013f: Status 404 returned error can't find the container with id e077ce6e1f6a6e08b776407c49e8e938752d354b6c74f0d6af105e56c1ef013f Mar 18 19:23:33 crc kubenswrapper[5008]: I0318 19:23:33.905480 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:23:33 crc kubenswrapper[5008]: I0318 19:23:33.991182 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 19:23:34 crc kubenswrapper[5008]: I0318 19:23:34.143710 5008 generic.go:334] "Generic (PLEG): container finished" podID="190656d4-52a0-4b74-9472-d74daeb2d2be" containerID="5b358adaa5b8af278a337f2f32328b20a69b8b793a2772cc07f1a4babc69d97d" exitCode=0 Mar 18 19:23:34 crc kubenswrapper[5008]: I0318 19:23:34.143780 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" event={"ID":"190656d4-52a0-4b74-9472-d74daeb2d2be","Type":"ContainerDied","Data":"5b358adaa5b8af278a337f2f32328b20a69b8b793a2772cc07f1a4babc69d97d"} Mar 18 19:23:34 crc kubenswrapper[5008]: I0318 19:23:34.143808 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" event={"ID":"190656d4-52a0-4b74-9472-d74daeb2d2be","Type":"ContainerStarted","Data":"e077ce6e1f6a6e08b776407c49e8e938752d354b6c74f0d6af105e56c1ef013f"} Mar 18 19:23:35 crc 
kubenswrapper[5008]: I0318 19:23:35.154854 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" event={"ID":"190656d4-52a0-4b74-9472-d74daeb2d2be","Type":"ContainerStarted","Data":"133e68700e437bbf2545fcbbf4120be1e1fe88222d0353e6b4689cd616eff500"} Mar 18 19:23:35 crc kubenswrapper[5008]: I0318 19:23:35.155019 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:23:35 crc kubenswrapper[5008]: I0318 19:23:35.180808 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" podStartSLOduration=3.180794264 podStartE2EDuration="3.180794264s" podCreationTimestamp="2026-03-18 19:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:23:35.178905555 +0000 UTC m=+4871.698378634" watchObservedRunningTime="2026-03-18 19:23:35.180794264 +0000 UTC m=+4871.700267343" Mar 18 19:23:35 crc kubenswrapper[5008]: I0318 19:23:35.696360 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="343ee353-693d-44a8-9473-805ad741f837" containerName="rabbitmq" containerID="cri-o://c35fc69a569ca83654762edd7ad8737758df62058cb9500f9b7bd75fa8ca93b4" gracePeriod=604799 Mar 18 19:23:35 crc kubenswrapper[5008]: I0318 19:23:35.767063 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="f6c00d8d-362a-4cb1-8332-7507fa863003" containerName="rabbitmq" containerID="cri-o://a19c446fa241e94f1615222bf3d5fca8c7643db4cd5a8957e2a102a8a73b03fa" gracePeriod=604799 Mar 18 19:23:36 crc kubenswrapper[5008]: I0318 19:23:36.066759 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="f6c00d8d-362a-4cb1-8332-7507fa863003" containerName="rabbitmq" probeResult="failure" 
output="dial tcp 10.217.1.19:5672: connect: connection refused" Mar 18 19:23:36 crc kubenswrapper[5008]: I0318 19:23:36.102482 5008 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="343ee353-693d-44a8-9473-805ad741f837" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.20:5672: connect: connection refused" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.217959 5008 generic.go:334] "Generic (PLEG): container finished" podID="f6c00d8d-362a-4cb1-8332-7507fa863003" containerID="a19c446fa241e94f1615222bf3d5fca8c7643db4cd5a8957e2a102a8a73b03fa" exitCode=0 Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.218059 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6c00d8d-362a-4cb1-8332-7507fa863003","Type":"ContainerDied","Data":"a19c446fa241e94f1615222bf3d5fca8c7643db4cd5a8957e2a102a8a73b03fa"} Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.221469 5008 generic.go:334] "Generic (PLEG): container finished" podID="343ee353-693d-44a8-9473-805ad741f837" containerID="c35fc69a569ca83654762edd7ad8737758df62058cb9500f9b7bd75fa8ca93b4" exitCode=0 Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.221509 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"343ee353-693d-44a8-9473-805ad741f837","Type":"ContainerDied","Data":"c35fc69a569ca83654762edd7ad8737758df62058cb9500f9b7bd75fa8ca93b4"} Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.415199 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.420793 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.604946 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tddgw\" (UniqueName: \"kubernetes.io/projected/343ee353-693d-44a8-9473-805ad741f837-kube-api-access-tddgw\") pod \"343ee353-693d-44a8-9473-805ad741f837\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.604986 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-erlang-cookie\") pod \"343ee353-693d-44a8-9473-805ad741f837\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605033 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6c00d8d-362a-4cb1-8332-7507fa863003-pod-info\") pod \"f6c00d8d-362a-4cb1-8332-7507fa863003\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605057 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-confd\") pod \"f6c00d8d-362a-4cb1-8332-7507fa863003\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605112 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dftd7\" (UniqueName: \"kubernetes.io/projected/f6c00d8d-362a-4cb1-8332-7507fa863003-kube-api-access-dftd7\") pod \"f6c00d8d-362a-4cb1-8332-7507fa863003\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605141 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/343ee353-693d-44a8-9473-805ad741f837-erlang-cookie-secret\") pod \"343ee353-693d-44a8-9473-805ad741f837\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605241 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee324809-640a-4b29-871f-efb82a37bdb0\") pod \"f6c00d8d-362a-4cb1-8332-7507fa863003\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605263 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-plugins\") pod \"343ee353-693d-44a8-9473-805ad741f837\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605318 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6c00d8d-362a-4cb1-8332-7507fa863003-server-conf\") pod \"f6c00d8d-362a-4cb1-8332-7507fa863003\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605334 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6c00d8d-362a-4cb1-8332-7507fa863003-plugins-conf\") pod \"f6c00d8d-362a-4cb1-8332-7507fa863003\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605412 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-762454e4-da82-424a-943a-c042eec94621\") pod \"343ee353-693d-44a8-9473-805ad741f837\" (UID: 
\"343ee353-693d-44a8-9473-805ad741f837\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605434 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/343ee353-693d-44a8-9473-805ad741f837-server-conf\") pod \"343ee353-693d-44a8-9473-805ad741f837\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605449 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/343ee353-693d-44a8-9473-805ad741f837-plugins-conf\") pod \"343ee353-693d-44a8-9473-805ad741f837\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605474 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-plugins\") pod \"f6c00d8d-362a-4cb1-8332-7507fa863003\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605493 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-erlang-cookie\") pod \"f6c00d8d-362a-4cb1-8332-7507fa863003\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605515 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/343ee353-693d-44a8-9473-805ad741f837-pod-info\") pod \"343ee353-693d-44a8-9473-805ad741f837\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605534 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-confd\") pod \"343ee353-693d-44a8-9473-805ad741f837\" (UID: \"343ee353-693d-44a8-9473-805ad741f837\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605551 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6c00d8d-362a-4cb1-8332-7507fa863003-erlang-cookie-secret\") pod \"f6c00d8d-362a-4cb1-8332-7507fa863003\" (UID: \"f6c00d8d-362a-4cb1-8332-7507fa863003\") " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605702 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "343ee353-693d-44a8-9473-805ad741f837" (UID: "343ee353-693d-44a8-9473-805ad741f837"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.605999 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.606513 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "343ee353-693d-44a8-9473-805ad741f837" (UID: "343ee353-693d-44a8-9473-805ad741f837"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.607794 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f6c00d8d-362a-4cb1-8332-7507fa863003" (UID: "f6c00d8d-362a-4cb1-8332-7507fa863003"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.608056 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6c00d8d-362a-4cb1-8332-7507fa863003-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f6c00d8d-362a-4cb1-8332-7507fa863003" (UID: "f6c00d8d-362a-4cb1-8332-7507fa863003"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.608927 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343ee353-693d-44a8-9473-805ad741f837-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "343ee353-693d-44a8-9473-805ad741f837" (UID: "343ee353-693d-44a8-9473-805ad741f837"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.611217 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/343ee353-693d-44a8-9473-805ad741f837-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "343ee353-693d-44a8-9473-805ad741f837" (UID: "343ee353-693d-44a8-9473-805ad741f837"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.612741 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f6c00d8d-362a-4cb1-8332-7507fa863003" (UID: "f6c00d8d-362a-4cb1-8332-7507fa863003"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.613211 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6c00d8d-362a-4cb1-8332-7507fa863003-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f6c00d8d-362a-4cb1-8332-7507fa863003" (UID: "f6c00d8d-362a-4cb1-8332-7507fa863003"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.613379 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343ee353-693d-44a8-9473-805ad741f837-kube-api-access-tddgw" (OuterVolumeSpecName: "kube-api-access-tddgw") pod "343ee353-693d-44a8-9473-805ad741f837" (UID: "343ee353-693d-44a8-9473-805ad741f837"). InnerVolumeSpecName "kube-api-access-tddgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.618530 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/343ee353-693d-44a8-9473-805ad741f837-pod-info" (OuterVolumeSpecName: "pod-info") pod "343ee353-693d-44a8-9473-805ad741f837" (UID: "343ee353-693d-44a8-9473-805ad741f837"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.618579 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6c00d8d-362a-4cb1-8332-7507fa863003-kube-api-access-dftd7" (OuterVolumeSpecName: "kube-api-access-dftd7") pod "f6c00d8d-362a-4cb1-8332-7507fa863003" (UID: "f6c00d8d-362a-4cb1-8332-7507fa863003"). InnerVolumeSpecName "kube-api-access-dftd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.618666 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f6c00d8d-362a-4cb1-8332-7507fa863003-pod-info" (OuterVolumeSpecName: "pod-info") pod "f6c00d8d-362a-4cb1-8332-7507fa863003" (UID: "f6c00d8d-362a-4cb1-8332-7507fa863003"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.634781 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-762454e4-da82-424a-943a-c042eec94621" (OuterVolumeSpecName: "persistence") pod "343ee353-693d-44a8-9473-805ad741f837" (UID: "343ee353-693d-44a8-9473-805ad741f837"). InnerVolumeSpecName "pvc-762454e4-da82-424a-943a-c042eec94621". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.639224 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee324809-640a-4b29-871f-efb82a37bdb0" (OuterVolumeSpecName: "persistence") pod "f6c00d8d-362a-4cb1-8332-7507fa863003" (UID: "f6c00d8d-362a-4cb1-8332-7507fa863003"). InnerVolumeSpecName "pvc-ee324809-640a-4b29-871f-efb82a37bdb0". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.639358 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6c00d8d-362a-4cb1-8332-7507fa863003-server-conf" (OuterVolumeSpecName: "server-conf") pod "f6c00d8d-362a-4cb1-8332-7507fa863003" (UID: "f6c00d8d-362a-4cb1-8332-7507fa863003"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.648630 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343ee353-693d-44a8-9473-805ad741f837-server-conf" (OuterVolumeSpecName: "server-conf") pod "343ee353-693d-44a8-9473-805ad741f837" (UID: "343ee353-693d-44a8-9473-805ad741f837"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.696191 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "343ee353-693d-44a8-9473-805ad741f837" (UID: "343ee353-693d-44a8-9473-805ad741f837"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707182 5008 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/343ee353-693d-44a8-9473-805ad741f837-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707227 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ee324809-640a-4b29-871f-efb82a37bdb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee324809-640a-4b29-871f-efb82a37bdb0\") on node \"crc\" " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707241 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707253 5008 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6c00d8d-362a-4cb1-8332-7507fa863003-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707261 5008 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6c00d8d-362a-4cb1-8332-7507fa863003-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707278 5008 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-762454e4-da82-424a-943a-c042eec94621\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-762454e4-da82-424a-943a-c042eec94621\") on node \"crc\" " Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707287 5008 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/343ee353-693d-44a8-9473-805ad741f837-server-conf\") 
on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707296 5008 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/343ee353-693d-44a8-9473-805ad741f837-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707305 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707314 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707322 5008 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/343ee353-693d-44a8-9473-805ad741f837-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707331 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/343ee353-693d-44a8-9473-805ad741f837-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707339 5008 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6c00d8d-362a-4cb1-8332-7507fa863003-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707347 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tddgw\" (UniqueName: \"kubernetes.io/projected/343ee353-693d-44a8-9473-805ad741f837-kube-api-access-tddgw\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 
19:23:42.707355 5008 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6c00d8d-362a-4cb1-8332-7507fa863003-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.707363 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dftd7\" (UniqueName: \"kubernetes.io/projected/f6c00d8d-362a-4cb1-8332-7507fa863003-kube-api-access-dftd7\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.715328 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f6c00d8d-362a-4cb1-8332-7507fa863003" (UID: "f6c00d8d-362a-4cb1-8332-7507fa863003"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.730367 5008 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.730521 5008 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ee324809-640a-4b29-871f-efb82a37bdb0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee324809-640a-4b29-871f-efb82a37bdb0") on node "crc" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.738302 5008 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.738430 5008 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-762454e4-da82-424a-943a-c042eec94621" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-762454e4-da82-424a-943a-c042eec94621") on node "crc" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.809234 5008 reconciler_common.go:293] "Volume detached for volume \"pvc-ee324809-640a-4b29-871f-efb82a37bdb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee324809-640a-4b29-871f-efb82a37bdb0\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.809270 5008 reconciler_common.go:293] "Volume detached for volume \"pvc-762454e4-da82-424a-943a-c042eec94621\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-762454e4-da82-424a-943a-c042eec94621\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.809284 5008 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6c00d8d-362a-4cb1-8332-7507fa863003-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.899443 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.953147 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-dgh2j"] Mar 18 19:23:42 crc kubenswrapper[5008]: I0318 19:23:42.953395 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" podUID="3eb89a23-7c7a-4e77-8f39-85e59d4a9d46" containerName="dnsmasq-dns" containerID="cri-o://f43f35a583ebc1f425ba8573f1ad653169cc84ec623103eed534ca4ff4a8d85c" gracePeriod=10 Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.233526 5008 generic.go:334] "Generic (PLEG): container 
finished" podID="3eb89a23-7c7a-4e77-8f39-85e59d4a9d46" containerID="f43f35a583ebc1f425ba8573f1ad653169cc84ec623103eed534ca4ff4a8d85c" exitCode=0 Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.233617 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" event={"ID":"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46","Type":"ContainerDied","Data":"f43f35a583ebc1f425ba8573f1ad653169cc84ec623103eed534ca4ff4a8d85c"} Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.237352 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6c00d8d-362a-4cb1-8332-7507fa863003","Type":"ContainerDied","Data":"1459a1f8c19f1d04228477d108dc8841529b2877e704271eab1ca9f3a7d026f7"} Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.237394 5008 scope.go:117] "RemoveContainer" containerID="a19c446fa241e94f1615222bf3d5fca8c7643db4cd5a8957e2a102a8a73b03fa" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.237608 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.254654 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"343ee353-693d-44a8-9473-805ad741f837","Type":"ContainerDied","Data":"c97a5b3f00691dcc3b4089f958911f79976efdd12829e1080a863778aa980d64"} Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.254721 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.268280 5008 scope.go:117] "RemoveContainer" containerID="cef769c0644db0d2f2ca0b34e22e5338120aaf2c5e915a5948d61ea7c2098d7b" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.309797 5008 scope.go:117] "RemoveContainer" containerID="c35fc69a569ca83654762edd7ad8737758df62058cb9500f9b7bd75fa8ca93b4" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.314709 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.337879 5008 scope.go:117] "RemoveContainer" containerID="20e88dd6639973a378d733dbd06150698bfb01dacf6f820c3b4336077dc66fef" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.338003 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.348440 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 19:23:43 crc kubenswrapper[5008]: E0318 19:23:43.348871 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343ee353-693d-44a8-9473-805ad741f837" containerName="setup-container" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.348890 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="343ee353-693d-44a8-9473-805ad741f837" containerName="setup-container" Mar 18 19:23:43 crc kubenswrapper[5008]: E0318 19:23:43.348903 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343ee353-693d-44a8-9473-805ad741f837" containerName="rabbitmq" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.348909 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="343ee353-693d-44a8-9473-805ad741f837" containerName="rabbitmq" Mar 18 19:23:43 crc kubenswrapper[5008]: E0318 19:23:43.348920 5008 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f6c00d8d-362a-4cb1-8332-7507fa863003" containerName="rabbitmq" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.348926 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c00d8d-362a-4cb1-8332-7507fa863003" containerName="rabbitmq" Mar 18 19:23:43 crc kubenswrapper[5008]: E0318 19:23:43.348940 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c00d8d-362a-4cb1-8332-7507fa863003" containerName="setup-container" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.348947 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c00d8d-362a-4cb1-8332-7507fa863003" containerName="setup-container" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.349099 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6c00d8d-362a-4cb1-8332-7507fa863003" containerName="rabbitmq" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.349119 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="343ee353-693d-44a8-9473-805ad741f837" containerName="rabbitmq" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.349844 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.364216 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.364857 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.365123 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.365423 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.365854 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mp58x" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.376390 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.398351 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.411021 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.428658 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.430086 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.433007 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.433651 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.433999 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.434196 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-sff5b" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.434332 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.440748 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.454741 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519062 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-config\") pod \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\" (UID: \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\") " Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519184 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhd6f\" (UniqueName: \"kubernetes.io/projected/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-kube-api-access-jhd6f\") pod \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\" (UID: \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\") " Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519224 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-dns-svc\") pod \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\" (UID: \"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46\") " Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519407 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrbsg\" (UniqueName: \"kubernetes.io/projected/bf16152f-7363-4a9a-906b-237917d3e262-kube-api-access-jrbsg\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519453 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4b353952-1871-46aa-b76a-a475bbc9fb42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519471 5008 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4b353952-1871-46aa-b76a-a475bbc9fb42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519489 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4b353952-1871-46aa-b76a-a475bbc9fb42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519505 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf16152f-7363-4a9a-906b-237917d3e262-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519523 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4b353952-1871-46aa-b76a-a475bbc9fb42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519540 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf16152f-7363-4a9a-906b-237917d3e262-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 
19:23:43.519577 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbw4g\" (UniqueName: \"kubernetes.io/projected/4b353952-1871-46aa-b76a-a475bbc9fb42-kube-api-access-wbw4g\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519595 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf16152f-7363-4a9a-906b-237917d3e262-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519610 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4b353952-1871-46aa-b76a-a475bbc9fb42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519636 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ee324809-640a-4b29-871f-efb82a37bdb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee324809-640a-4b29-871f-efb82a37bdb0\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519655 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf16152f-7363-4a9a-906b-237917d3e262-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: 
I0318 19:23:43.519670 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4b353952-1871-46aa-b76a-a475bbc9fb42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519688 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf16152f-7363-4a9a-906b-237917d3e262-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.519715 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4b353952-1871-46aa-b76a-a475bbc9fb42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.520864 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-762454e4-da82-424a-943a-c042eec94621\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-762454e4-da82-424a-943a-c042eec94621\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.520897 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf16152f-7363-4a9a-906b-237917d3e262-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 
19:23:43.520950 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf16152f-7363-4a9a-906b-237917d3e262-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.522948 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-kube-api-access-jhd6f" (OuterVolumeSpecName: "kube-api-access-jhd6f") pod "3eb89a23-7c7a-4e77-8f39-85e59d4a9d46" (UID: "3eb89a23-7c7a-4e77-8f39-85e59d4a9d46"). InnerVolumeSpecName "kube-api-access-jhd6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.547065 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-config" (OuterVolumeSpecName: "config") pod "3eb89a23-7c7a-4e77-8f39-85e59d4a9d46" (UID: "3eb89a23-7c7a-4e77-8f39-85e59d4a9d46"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.550953 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3eb89a23-7c7a-4e77-8f39-85e59d4a9d46" (UID: "3eb89a23-7c7a-4e77-8f39-85e59d4a9d46"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.622334 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf16152f-7363-4a9a-906b-237917d3e262-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.622421 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbw4g\" (UniqueName: \"kubernetes.io/projected/4b353952-1871-46aa-b76a-a475bbc9fb42-kube-api-access-wbw4g\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.622458 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4b353952-1871-46aa-b76a-a475bbc9fb42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.622492 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf16152f-7363-4a9a-906b-237917d3e262-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.622576 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ee324809-640a-4b29-871f-efb82a37bdb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee324809-640a-4b29-871f-efb82a37bdb0\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 
crc kubenswrapper[5008]: I0318 19:23:43.622622 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf16152f-7363-4a9a-906b-237917d3e262-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.622653 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4b353952-1871-46aa-b76a-a475bbc9fb42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.622683 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf16152f-7363-4a9a-906b-237917d3e262-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.622733 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4b353952-1871-46aa-b76a-a475bbc9fb42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.622802 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-762454e4-da82-424a-943a-c042eec94621\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-762454e4-da82-424a-943a-c042eec94621\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.622840 5008 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf16152f-7363-4a9a-906b-237917d3e262-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.622891 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf16152f-7363-4a9a-906b-237917d3e262-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.622933 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrbsg\" (UniqueName: \"kubernetes.io/projected/bf16152f-7363-4a9a-906b-237917d3e262-kube-api-access-jrbsg\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.622991 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4b353952-1871-46aa-b76a-a475bbc9fb42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.623026 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4b353952-1871-46aa-b76a-a475bbc9fb42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.623029 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/bf16152f-7363-4a9a-906b-237917d3e262-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.623060 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4b353952-1871-46aa-b76a-a475bbc9fb42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.623097 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf16152f-7363-4a9a-906b-237917d3e262-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.623133 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4b353952-1871-46aa-b76a-a475bbc9fb42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.623204 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.623227 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-config\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.623246 5008 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jhd6f\" (UniqueName: \"kubernetes.io/projected/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46-kube-api-access-jhd6f\") on node \"crc\" DevicePath \"\"" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.624683 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf16152f-7363-4a9a-906b-237917d3e262-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.625256 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4b353952-1871-46aa-b76a-a475bbc9fb42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.625865 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf16152f-7363-4a9a-906b-237917d3e262-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.626140 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4b353952-1871-46aa-b76a-a475bbc9fb42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.626640 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4b353952-1871-46aa-b76a-a475bbc9fb42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.627739 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf16152f-7363-4a9a-906b-237917d3e262-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.628023 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.628073 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ee324809-640a-4b29-871f-efb82a37bdb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee324809-640a-4b29-871f-efb82a37bdb0\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2270bcf2b6cc50749fc4da73a03f1ee9c9c2e70bdb8d1c03e859f2f39eda1765/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.628426 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4b353952-1871-46aa-b76a-a475bbc9fb42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.628898 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4b353952-1871-46aa-b76a-a475bbc9fb42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.629864 5008 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.629892 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-762454e4-da82-424a-943a-c042eec94621\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-762454e4-da82-424a-943a-c042eec94621\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7545baf37603d6959806d2dfe7b0b7d6eb4ba700ba63c9c8a9207fd98584881d/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.629865 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf16152f-7363-4a9a-906b-237917d3e262-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.630711 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf16152f-7363-4a9a-906b-237917d3e262-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.631243 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4b353952-1871-46aa-b76a-a475bbc9fb42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.633210 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/4b353952-1871-46aa-b76a-a475bbc9fb42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.635137 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf16152f-7363-4a9a-906b-237917d3e262-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.644124 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrbsg\" (UniqueName: \"kubernetes.io/projected/bf16152f-7363-4a9a-906b-237917d3e262-kube-api-access-jrbsg\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.645300 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbw4g\" (UniqueName: \"kubernetes.io/projected/4b353952-1871-46aa-b76a-a475bbc9fb42-kube-api-access-wbw4g\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.655226 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ee324809-640a-4b29-871f-efb82a37bdb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee324809-640a-4b29-871f-efb82a37bdb0\") pod \"rabbitmq-cell1-server-0\" (UID: \"4b353952-1871-46aa-b76a-a475bbc9fb42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.662433 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-762454e4-da82-424a-943a-c042eec94621\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-762454e4-da82-424a-943a-c042eec94621\") pod \"rabbitmq-server-0\" (UID: \"bf16152f-7363-4a9a-906b-237917d3e262\") " pod="openstack/rabbitmq-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.745308 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:23:43 crc kubenswrapper[5008]: I0318 19:23:43.771007 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 19:23:44 crc kubenswrapper[5008]: I0318 19:23:44.045630 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 19:23:44 crc kubenswrapper[5008]: I0318 19:23:44.211480 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343ee353-693d-44a8-9473-805ad741f837" path="/var/lib/kubelet/pods/343ee353-693d-44a8-9473-805ad741f837/volumes" Mar 18 19:23:44 crc kubenswrapper[5008]: I0318 19:23:44.212496 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6c00d8d-362a-4cb1-8332-7507fa863003" path="/var/lib/kubelet/pods/f6c00d8d-362a-4cb1-8332-7507fa863003/volumes" Mar 18 19:23:44 crc kubenswrapper[5008]: I0318 19:23:44.267656 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf16152f-7363-4a9a-906b-237917d3e262","Type":"ContainerStarted","Data":"186e0399f85055f33ee2a9a4c690bba8df3e15a1767bf56c808b725e88c2b1c2"} Mar 18 19:23:44 crc kubenswrapper[5008]: I0318 19:23:44.269512 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" event={"ID":"3eb89a23-7c7a-4e77-8f39-85e59d4a9d46","Type":"ContainerDied","Data":"bba7b29e3e99f56ef7cf92da7938e09a767846b57b7ebc93c405c3c7b0673043"} Mar 18 19:23:44 crc kubenswrapper[5008]: I0318 19:23:44.269609 5008 scope.go:117] "RemoveContainer" 
containerID="f43f35a583ebc1f425ba8573f1ad653169cc84ec623103eed534ca4ff4a8d85c" Mar 18 19:23:44 crc kubenswrapper[5008]: I0318 19:23:44.269633 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c95686bd5-dgh2j" Mar 18 19:23:44 crc kubenswrapper[5008]: I0318 19:23:44.289026 5008 scope.go:117] "RemoveContainer" containerID="188e373487b78cf1f4eabd71a3966192dda0ee206008cc2a18ba71089ef9628d" Mar 18 19:23:44 crc kubenswrapper[5008]: I0318 19:23:44.299114 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-dgh2j"] Mar 18 19:23:44 crc kubenswrapper[5008]: I0318 19:23:44.306279 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-dgh2j"] Mar 18 19:23:44 crc kubenswrapper[5008]: I0318 19:23:44.321316 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 19:23:44 crc kubenswrapper[5008]: W0318 19:23:44.362263 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b353952_1871_46aa_b76a_a475bbc9fb42.slice/crio-0c3f62f54911f152a6c6d6889b530dced958067117c67b229dc65c42f7dfe20e WatchSource:0}: Error finding container 0c3f62f54911f152a6c6d6889b530dced958067117c67b229dc65c42f7dfe20e: Status 404 returned error can't find the container with id 0c3f62f54911f152a6c6d6889b530dced958067117c67b229dc65c42f7dfe20e Mar 18 19:23:45 crc kubenswrapper[5008]: I0318 19:23:45.283125 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4b353952-1871-46aa-b76a-a475bbc9fb42","Type":"ContainerStarted","Data":"0c3f62f54911f152a6c6d6889b530dced958067117c67b229dc65c42f7dfe20e"} Mar 18 19:23:46 crc kubenswrapper[5008]: I0318 19:23:46.224340 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb89a23-7c7a-4e77-8f39-85e59d4a9d46" 
path="/var/lib/kubelet/pods/3eb89a23-7c7a-4e77-8f39-85e59d4a9d46/volumes" Mar 18 19:23:46 crc kubenswrapper[5008]: I0318 19:23:46.295807 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf16152f-7363-4a9a-906b-237917d3e262","Type":"ContainerStarted","Data":"fccde2af6f5a53e6b6a41b8bc82e756e523b7c60b85e423a770054b6c5ff344b"} Mar 18 19:23:46 crc kubenswrapper[5008]: I0318 19:23:46.297919 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4b353952-1871-46aa-b76a-a475bbc9fb42","Type":"ContainerStarted","Data":"084dd8a04e061429215129a0e8c703d04d37af179f51b0072697d133fbb9389b"} Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.159713 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7lfjk"] Mar 18 19:23:50 crc kubenswrapper[5008]: E0318 19:23:50.161430 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb89a23-7c7a-4e77-8f39-85e59d4a9d46" containerName="init" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.161465 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb89a23-7c7a-4e77-8f39-85e59d4a9d46" containerName="init" Mar 18 19:23:50 crc kubenswrapper[5008]: E0318 19:23:50.161501 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb89a23-7c7a-4e77-8f39-85e59d4a9d46" containerName="dnsmasq-dns" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.161520 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb89a23-7c7a-4e77-8f39-85e59d4a9d46" containerName="dnsmasq-dns" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.161968 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb89a23-7c7a-4e77-8f39-85e59d4a9d46" containerName="dnsmasq-dns" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.164358 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.181635 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7lfjk"] Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.229671 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-catalog-content\") pod \"certified-operators-7lfjk\" (UID: \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\") " pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.229769 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28htn\" (UniqueName: \"kubernetes.io/projected/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-kube-api-access-28htn\") pod \"certified-operators-7lfjk\" (UID: \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\") " pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.229873 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-utilities\") pod \"certified-operators-7lfjk\" (UID: \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\") " pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.332654 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-utilities\") pod \"certified-operators-7lfjk\" (UID: \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\") " pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.333252 5008 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-utilities\") pod \"certified-operators-7lfjk\" (UID: \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\") " pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.333307 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-catalog-content\") pod \"certified-operators-7lfjk\" (UID: \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\") " pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.332961 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-catalog-content\") pod \"certified-operators-7lfjk\" (UID: \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\") " pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.333404 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28htn\" (UniqueName: \"kubernetes.io/projected/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-kube-api-access-28htn\") pod \"certified-operators-7lfjk\" (UID: \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\") " pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.358411 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28htn\" (UniqueName: \"kubernetes.io/projected/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-kube-api-access-28htn\") pod \"certified-operators-7lfjk\" (UID: \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\") " pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.497528 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:23:50 crc kubenswrapper[5008]: I0318 19:23:50.819752 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7lfjk"] Mar 18 19:23:51 crc kubenswrapper[5008]: I0318 19:23:51.345860 5008 generic.go:334] "Generic (PLEG): container finished" podID="e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" containerID="bf0f1c246e9f5bb85be6273eb67081fa01677eb3aba756fa7ccfc249475ef46e" exitCode=0 Mar 18 19:23:51 crc kubenswrapper[5008]: I0318 19:23:51.345928 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lfjk" event={"ID":"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6","Type":"ContainerDied","Data":"bf0f1c246e9f5bb85be6273eb67081fa01677eb3aba756fa7ccfc249475ef46e"} Mar 18 19:23:51 crc kubenswrapper[5008]: I0318 19:23:51.346836 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lfjk" event={"ID":"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6","Type":"ContainerStarted","Data":"a55c69fa7821b40f65998f8412969a61553d7b1f8cc9c6663362f8d8dd9c8f64"} Mar 18 19:23:53 crc kubenswrapper[5008]: I0318 19:23:53.369179 5008 generic.go:334] "Generic (PLEG): container finished" podID="e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" containerID="5c65783f9607a7881ec87901366aead9bd2cb156e93db2ea1acaf970f27379c2" exitCode=0 Mar 18 19:23:53 crc kubenswrapper[5008]: I0318 19:23:53.369276 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lfjk" event={"ID":"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6","Type":"ContainerDied","Data":"5c65783f9607a7881ec87901366aead9bd2cb156e93db2ea1acaf970f27379c2"} Mar 18 19:23:54 crc kubenswrapper[5008]: I0318 19:23:54.377832 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lfjk" 
event={"ID":"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6","Type":"ContainerStarted","Data":"eda7fc295b0e1092cd99e39c9b3e1f122ba4db6de704190db8dfb6ec05348af6"} Mar 18 19:23:54 crc kubenswrapper[5008]: I0318 19:23:54.407466 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7lfjk" podStartSLOduration=1.896432065 podStartE2EDuration="4.407441418s" podCreationTimestamp="2026-03-18 19:23:50 +0000 UTC" firstStartedPulling="2026-03-18 19:23:51.347805045 +0000 UTC m=+4887.867278154" lastFinishedPulling="2026-03-18 19:23:53.858814388 +0000 UTC m=+4890.378287507" observedRunningTime="2026-03-18 19:23:54.40362263 +0000 UTC m=+4890.923095719" watchObservedRunningTime="2026-03-18 19:23:54.407441418 +0000 UTC m=+4890.926914517" Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.171948 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564364-9m4dm"] Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.174165 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564364-9m4dm" Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.179165 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.179405 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.179993 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.250056 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564364-9m4dm"] Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.294068 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp9ml\" (UniqueName: \"kubernetes.io/projected/06448434-1769-4027-a455-8fcc1ddff0f4-kube-api-access-bp9ml\") pod \"auto-csr-approver-29564364-9m4dm\" (UID: \"06448434-1769-4027-a455-8fcc1ddff0f4\") " pod="openshift-infra/auto-csr-approver-29564364-9m4dm" Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.394967 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp9ml\" (UniqueName: \"kubernetes.io/projected/06448434-1769-4027-a455-8fcc1ddff0f4-kube-api-access-bp9ml\") pod \"auto-csr-approver-29564364-9m4dm\" (UID: \"06448434-1769-4027-a455-8fcc1ddff0f4\") " pod="openshift-infra/auto-csr-approver-29564364-9m4dm" Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.416499 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp9ml\" (UniqueName: \"kubernetes.io/projected/06448434-1769-4027-a455-8fcc1ddff0f4-kube-api-access-bp9ml\") pod \"auto-csr-approver-29564364-9m4dm\" (UID: \"06448434-1769-4027-a455-8fcc1ddff0f4\") " 
pod="openshift-infra/auto-csr-approver-29564364-9m4dm" Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.498488 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.499147 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.523332 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564364-9m4dm" Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.550054 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:24:00 crc kubenswrapper[5008]: I0318 19:24:00.980096 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564364-9m4dm"] Mar 18 19:24:00 crc kubenswrapper[5008]: W0318 19:24:00.988807 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06448434_1769_4027_a455_8fcc1ddff0f4.slice/crio-2e3fd360430bcec1ccbca1f425b2b4005f8f17debc44be9c11b15aa927f1e8ce WatchSource:0}: Error finding container 2e3fd360430bcec1ccbca1f425b2b4005f8f17debc44be9c11b15aa927f1e8ce: Status 404 returned error can't find the container with id 2e3fd360430bcec1ccbca1f425b2b4005f8f17debc44be9c11b15aa927f1e8ce Mar 18 19:24:01 crc kubenswrapper[5008]: I0318 19:24:01.448368 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564364-9m4dm" event={"ID":"06448434-1769-4027-a455-8fcc1ddff0f4","Type":"ContainerStarted","Data":"2e3fd360430bcec1ccbca1f425b2b4005f8f17debc44be9c11b15aa927f1e8ce"} Mar 18 19:24:01 crc kubenswrapper[5008]: I0318 19:24:01.510570 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:24:01 crc kubenswrapper[5008]: I0318 19:24:01.561071 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7lfjk"] Mar 18 19:24:03 crc kubenswrapper[5008]: I0318 19:24:03.468711 5008 generic.go:334] "Generic (PLEG): container finished" podID="06448434-1769-4027-a455-8fcc1ddff0f4" containerID="693e20e09e839bab6b160169d6ebb10e485bf25efd6f9e8784901c4b8a82398f" exitCode=0 Mar 18 19:24:03 crc kubenswrapper[5008]: I0318 19:24:03.468795 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564364-9m4dm" event={"ID":"06448434-1769-4027-a455-8fcc1ddff0f4","Type":"ContainerDied","Data":"693e20e09e839bab6b160169d6ebb10e485bf25efd6f9e8784901c4b8a82398f"} Mar 18 19:24:03 crc kubenswrapper[5008]: I0318 19:24:03.469584 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7lfjk" podUID="e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" containerName="registry-server" containerID="cri-o://eda7fc295b0e1092cd99e39c9b3e1f122ba4db6de704190db8dfb6ec05348af6" gracePeriod=2 Mar 18 19:24:03 crc kubenswrapper[5008]: I0318 19:24:03.980736 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.159898 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-catalog-content\") pod \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\" (UID: \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\") " Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.160002 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28htn\" (UniqueName: \"kubernetes.io/projected/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-kube-api-access-28htn\") pod \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\" (UID: \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\") " Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.160111 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-utilities\") pod \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\" (UID: \"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6\") " Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.161529 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-utilities" (OuterVolumeSpecName: "utilities") pod "e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" (UID: "e7c2e4b9-5892-4f32-8e71-680a5bd52bd6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.169898 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-kube-api-access-28htn" (OuterVolumeSpecName: "kube-api-access-28htn") pod "e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" (UID: "e7c2e4b9-5892-4f32-8e71-680a5bd52bd6"). InnerVolumeSpecName "kube-api-access-28htn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.251112 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" (UID: "e7c2e4b9-5892-4f32-8e71-680a5bd52bd6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.262359 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.262402 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28htn\" (UniqueName: \"kubernetes.io/projected/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-kube-api-access-28htn\") on node \"crc\" DevicePath \"\"" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.262416 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.483906 5008 generic.go:334] "Generic (PLEG): container finished" podID="e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" containerID="eda7fc295b0e1092cd99e39c9b3e1f122ba4db6de704190db8dfb6ec05348af6" exitCode=0 Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.483972 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7lfjk" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.483996 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lfjk" event={"ID":"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6","Type":"ContainerDied","Data":"eda7fc295b0e1092cd99e39c9b3e1f122ba4db6de704190db8dfb6ec05348af6"} Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.484957 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lfjk" event={"ID":"e7c2e4b9-5892-4f32-8e71-680a5bd52bd6","Type":"ContainerDied","Data":"a55c69fa7821b40f65998f8412969a61553d7b1f8cc9c6663362f8d8dd9c8f64"} Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.485013 5008 scope.go:117] "RemoveContainer" containerID="eda7fc295b0e1092cd99e39c9b3e1f122ba4db6de704190db8dfb6ec05348af6" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.537449 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7lfjk"] Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.537630 5008 scope.go:117] "RemoveContainer" containerID="5c65783f9607a7881ec87901366aead9bd2cb156e93db2ea1acaf970f27379c2" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.544868 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7lfjk"] Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.758901 5008 scope.go:117] "RemoveContainer" containerID="bf0f1c246e9f5bb85be6273eb67081fa01677eb3aba756fa7ccfc249475ef46e" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.829913 5008 scope.go:117] "RemoveContainer" containerID="eda7fc295b0e1092cd99e39c9b3e1f122ba4db6de704190db8dfb6ec05348af6" Mar 18 19:24:04 crc kubenswrapper[5008]: E0318 19:24:04.830512 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"eda7fc295b0e1092cd99e39c9b3e1f122ba4db6de704190db8dfb6ec05348af6\": container with ID starting with eda7fc295b0e1092cd99e39c9b3e1f122ba4db6de704190db8dfb6ec05348af6 not found: ID does not exist" containerID="eda7fc295b0e1092cd99e39c9b3e1f122ba4db6de704190db8dfb6ec05348af6" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.830634 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda7fc295b0e1092cd99e39c9b3e1f122ba4db6de704190db8dfb6ec05348af6"} err="failed to get container status \"eda7fc295b0e1092cd99e39c9b3e1f122ba4db6de704190db8dfb6ec05348af6\": rpc error: code = NotFound desc = could not find container \"eda7fc295b0e1092cd99e39c9b3e1f122ba4db6de704190db8dfb6ec05348af6\": container with ID starting with eda7fc295b0e1092cd99e39c9b3e1f122ba4db6de704190db8dfb6ec05348af6 not found: ID does not exist" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.830677 5008 scope.go:117] "RemoveContainer" containerID="5c65783f9607a7881ec87901366aead9bd2cb156e93db2ea1acaf970f27379c2" Mar 18 19:24:04 crc kubenswrapper[5008]: E0318 19:24:04.831363 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c65783f9607a7881ec87901366aead9bd2cb156e93db2ea1acaf970f27379c2\": container with ID starting with 5c65783f9607a7881ec87901366aead9bd2cb156e93db2ea1acaf970f27379c2 not found: ID does not exist" containerID="5c65783f9607a7881ec87901366aead9bd2cb156e93db2ea1acaf970f27379c2" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.831430 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c65783f9607a7881ec87901366aead9bd2cb156e93db2ea1acaf970f27379c2"} err="failed to get container status \"5c65783f9607a7881ec87901366aead9bd2cb156e93db2ea1acaf970f27379c2\": rpc error: code = NotFound desc = could not find container \"5c65783f9607a7881ec87901366aead9bd2cb156e93db2ea1acaf970f27379c2\": container with ID 
starting with 5c65783f9607a7881ec87901366aead9bd2cb156e93db2ea1acaf970f27379c2 not found: ID does not exist" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.831464 5008 scope.go:117] "RemoveContainer" containerID="bf0f1c246e9f5bb85be6273eb67081fa01677eb3aba756fa7ccfc249475ef46e" Mar 18 19:24:04 crc kubenswrapper[5008]: E0318 19:24:04.831970 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf0f1c246e9f5bb85be6273eb67081fa01677eb3aba756fa7ccfc249475ef46e\": container with ID starting with bf0f1c246e9f5bb85be6273eb67081fa01677eb3aba756fa7ccfc249475ef46e not found: ID does not exist" containerID="bf0f1c246e9f5bb85be6273eb67081fa01677eb3aba756fa7ccfc249475ef46e" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.832028 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf0f1c246e9f5bb85be6273eb67081fa01677eb3aba756fa7ccfc249475ef46e"} err="failed to get container status \"bf0f1c246e9f5bb85be6273eb67081fa01677eb3aba756fa7ccfc249475ef46e\": rpc error: code = NotFound desc = could not find container \"bf0f1c246e9f5bb85be6273eb67081fa01677eb3aba756fa7ccfc249475ef46e\": container with ID starting with bf0f1c246e9f5bb85be6273eb67081fa01677eb3aba756fa7ccfc249475ef46e not found: ID does not exist" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.839818 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564364-9m4dm" Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.973817 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp9ml\" (UniqueName: \"kubernetes.io/projected/06448434-1769-4027-a455-8fcc1ddff0f4-kube-api-access-bp9ml\") pod \"06448434-1769-4027-a455-8fcc1ddff0f4\" (UID: \"06448434-1769-4027-a455-8fcc1ddff0f4\") " Mar 18 19:24:04 crc kubenswrapper[5008]: I0318 19:24:04.978256 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06448434-1769-4027-a455-8fcc1ddff0f4-kube-api-access-bp9ml" (OuterVolumeSpecName: "kube-api-access-bp9ml") pod "06448434-1769-4027-a455-8fcc1ddff0f4" (UID: "06448434-1769-4027-a455-8fcc1ddff0f4"). InnerVolumeSpecName "kube-api-access-bp9ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:24:05 crc kubenswrapper[5008]: I0318 19:24:05.077066 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp9ml\" (UniqueName: \"kubernetes.io/projected/06448434-1769-4027-a455-8fcc1ddff0f4-kube-api-access-bp9ml\") on node \"crc\" DevicePath \"\"" Mar 18 19:24:05 crc kubenswrapper[5008]: I0318 19:24:05.719743 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564364-9m4dm" event={"ID":"06448434-1769-4027-a455-8fcc1ddff0f4","Type":"ContainerDied","Data":"2e3fd360430bcec1ccbca1f425b2b4005f8f17debc44be9c11b15aa927f1e8ce"} Mar 18 19:24:05 crc kubenswrapper[5008]: I0318 19:24:05.719955 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e3fd360430bcec1ccbca1f425b2b4005f8f17debc44be9c11b15aa927f1e8ce" Mar 18 19:24:05 crc kubenswrapper[5008]: I0318 19:24:05.719761 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564364-9m4dm" Mar 18 19:24:05 crc kubenswrapper[5008]: I0318 19:24:05.936800 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564358-m5dff"] Mar 18 19:24:05 crc kubenswrapper[5008]: I0318 19:24:05.942489 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564358-m5dff"] Mar 18 19:24:06 crc kubenswrapper[5008]: I0318 19:24:06.214785 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52398f8d-42bf-41c8-87d6-01abb0f79a04" path="/var/lib/kubelet/pods/52398f8d-42bf-41c8-87d6-01abb0f79a04/volumes" Mar 18 19:24:06 crc kubenswrapper[5008]: I0318 19:24:06.216162 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" path="/var/lib/kubelet/pods/e7c2e4b9-5892-4f32-8e71-680a5bd52bd6/volumes" Mar 18 19:24:10 crc kubenswrapper[5008]: I0318 19:24:10.880146 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sz4dr"] Mar 18 19:24:10 crc kubenswrapper[5008]: E0318 19:24:10.882847 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" containerName="registry-server" Mar 18 19:24:10 crc kubenswrapper[5008]: I0318 19:24:10.883129 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" containerName="registry-server" Mar 18 19:24:10 crc kubenswrapper[5008]: E0318 19:24:10.883291 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" containerName="extract-utilities" Mar 18 19:24:10 crc kubenswrapper[5008]: I0318 19:24:10.883435 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" containerName="extract-utilities" Mar 18 19:24:10 crc kubenswrapper[5008]: E0318 19:24:10.883617 5008 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" containerName="extract-content" Mar 18 19:24:10 crc kubenswrapper[5008]: I0318 19:24:10.883817 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" containerName="extract-content" Mar 18 19:24:10 crc kubenswrapper[5008]: E0318 19:24:10.883969 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06448434-1769-4027-a455-8fcc1ddff0f4" containerName="oc" Mar 18 19:24:10 crc kubenswrapper[5008]: I0318 19:24:10.884113 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="06448434-1769-4027-a455-8fcc1ddff0f4" containerName="oc" Mar 18 19:24:10 crc kubenswrapper[5008]: I0318 19:24:10.884644 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="06448434-1769-4027-a455-8fcc1ddff0f4" containerName="oc" Mar 18 19:24:10 crc kubenswrapper[5008]: I0318 19:24:10.884827 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c2e4b9-5892-4f32-8e71-680a5bd52bd6" containerName="registry-server" Mar 18 19:24:10 crc kubenswrapper[5008]: I0318 19:24:10.887068 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:10 crc kubenswrapper[5008]: I0318 19:24:10.892972 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sz4dr"] Mar 18 19:24:10 crc kubenswrapper[5008]: I0318 19:24:10.991423 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbcrf\" (UniqueName: \"kubernetes.io/projected/1f13e170-ee8b-42bd-b267-2aeaec25edfb-kube-api-access-hbcrf\") pod \"community-operators-sz4dr\" (UID: \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\") " pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:10 crc kubenswrapper[5008]: I0318 19:24:10.991521 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f13e170-ee8b-42bd-b267-2aeaec25edfb-utilities\") pod \"community-operators-sz4dr\" (UID: \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\") " pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:10 crc kubenswrapper[5008]: I0318 19:24:10.991799 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f13e170-ee8b-42bd-b267-2aeaec25edfb-catalog-content\") pod \"community-operators-sz4dr\" (UID: \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\") " pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:11 crc kubenswrapper[5008]: I0318 19:24:11.093658 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbcrf\" (UniqueName: \"kubernetes.io/projected/1f13e170-ee8b-42bd-b267-2aeaec25edfb-kube-api-access-hbcrf\") pod \"community-operators-sz4dr\" (UID: \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\") " pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:11 crc kubenswrapper[5008]: I0318 19:24:11.093732 5008 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f13e170-ee8b-42bd-b267-2aeaec25edfb-utilities\") pod \"community-operators-sz4dr\" (UID: \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\") " pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:11 crc kubenswrapper[5008]: I0318 19:24:11.093831 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f13e170-ee8b-42bd-b267-2aeaec25edfb-catalog-content\") pod \"community-operators-sz4dr\" (UID: \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\") " pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:11 crc kubenswrapper[5008]: I0318 19:24:11.094404 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f13e170-ee8b-42bd-b267-2aeaec25edfb-utilities\") pod \"community-operators-sz4dr\" (UID: \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\") " pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:11 crc kubenswrapper[5008]: I0318 19:24:11.094457 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f13e170-ee8b-42bd-b267-2aeaec25edfb-catalog-content\") pod \"community-operators-sz4dr\" (UID: \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\") " pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:11 crc kubenswrapper[5008]: I0318 19:24:11.121950 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbcrf\" (UniqueName: \"kubernetes.io/projected/1f13e170-ee8b-42bd-b267-2aeaec25edfb-kube-api-access-hbcrf\") pod \"community-operators-sz4dr\" (UID: \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\") " pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:11 crc kubenswrapper[5008]: I0318 19:24:11.215866 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:11 crc kubenswrapper[5008]: I0318 19:24:11.760723 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sz4dr"] Mar 18 19:24:11 crc kubenswrapper[5008]: W0318 19:24:11.766367 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f13e170_ee8b_42bd_b267_2aeaec25edfb.slice/crio-ba456f71168a6d3255650ce80502770a20016b8c8d4d210b6b99e74b20cf59c5 WatchSource:0}: Error finding container ba456f71168a6d3255650ce80502770a20016b8c8d4d210b6b99e74b20cf59c5: Status 404 returned error can't find the container with id ba456f71168a6d3255650ce80502770a20016b8c8d4d210b6b99e74b20cf59c5 Mar 18 19:24:11 crc kubenswrapper[5008]: I0318 19:24:11.780528 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sz4dr" event={"ID":"1f13e170-ee8b-42bd-b267-2aeaec25edfb","Type":"ContainerStarted","Data":"ba456f71168a6d3255650ce80502770a20016b8c8d4d210b6b99e74b20cf59c5"} Mar 18 19:24:12 crc kubenswrapper[5008]: I0318 19:24:12.792294 5008 generic.go:334] "Generic (PLEG): container finished" podID="1f13e170-ee8b-42bd-b267-2aeaec25edfb" containerID="54546dd824552e8ce07b0a545e3111b0592948224fbf3872836dd8bb12475ea5" exitCode=0 Mar 18 19:24:12 crc kubenswrapper[5008]: I0318 19:24:12.792469 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sz4dr" event={"ID":"1f13e170-ee8b-42bd-b267-2aeaec25edfb","Type":"ContainerDied","Data":"54546dd824552e8ce07b0a545e3111b0592948224fbf3872836dd8bb12475ea5"} Mar 18 19:24:14 crc kubenswrapper[5008]: I0318 19:24:14.821013 5008 generic.go:334] "Generic (PLEG): container finished" podID="1f13e170-ee8b-42bd-b267-2aeaec25edfb" containerID="d30f7a308aa6dc08c5c5cc372567da9675ddd88b0ca5c9602564cf07542566a7" exitCode=0 Mar 18 19:24:14 crc kubenswrapper[5008]: I0318 
19:24:14.821066 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sz4dr" event={"ID":"1f13e170-ee8b-42bd-b267-2aeaec25edfb","Type":"ContainerDied","Data":"d30f7a308aa6dc08c5c5cc372567da9675ddd88b0ca5c9602564cf07542566a7"} Mar 18 19:24:15 crc kubenswrapper[5008]: I0318 19:24:15.836972 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sz4dr" event={"ID":"1f13e170-ee8b-42bd-b267-2aeaec25edfb","Type":"ContainerStarted","Data":"4041c1a9dfcf7373e4193dead8fcb87936ef90f402ace9edd2bb1fc2efc53b47"} Mar 18 19:24:15 crc kubenswrapper[5008]: I0318 19:24:15.881117 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sz4dr" podStartSLOduration=3.378483268 podStartE2EDuration="5.881091144s" podCreationTimestamp="2026-03-18 19:24:10 +0000 UTC" firstStartedPulling="2026-03-18 19:24:12.794545139 +0000 UTC m=+4909.314018218" lastFinishedPulling="2026-03-18 19:24:15.297153005 +0000 UTC m=+4911.816626094" observedRunningTime="2026-03-18 19:24:15.86498883 +0000 UTC m=+4912.384461989" watchObservedRunningTime="2026-03-18 19:24:15.881091144 +0000 UTC m=+4912.400564233" Mar 18 19:24:18 crc kubenswrapper[5008]: I0318 19:24:18.867885 5008 generic.go:334] "Generic (PLEG): container finished" podID="bf16152f-7363-4a9a-906b-237917d3e262" containerID="fccde2af6f5a53e6b6a41b8bc82e756e523b7c60b85e423a770054b6c5ff344b" exitCode=0 Mar 18 19:24:18 crc kubenswrapper[5008]: I0318 19:24:18.868005 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf16152f-7363-4a9a-906b-237917d3e262","Type":"ContainerDied","Data":"fccde2af6f5a53e6b6a41b8bc82e756e523b7c60b85e423a770054b6c5ff344b"} Mar 18 19:24:18 crc kubenswrapper[5008]: I0318 19:24:18.871140 5008 generic.go:334] "Generic (PLEG): container finished" podID="4b353952-1871-46aa-b76a-a475bbc9fb42" 
containerID="084dd8a04e061429215129a0e8c703d04d37af179f51b0072697d133fbb9389b" exitCode=0 Mar 18 19:24:18 crc kubenswrapper[5008]: I0318 19:24:18.871192 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4b353952-1871-46aa-b76a-a475bbc9fb42","Type":"ContainerDied","Data":"084dd8a04e061429215129a0e8c703d04d37af179f51b0072697d133fbb9389b"} Mar 18 19:24:19 crc kubenswrapper[5008]: I0318 19:24:19.878899 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4b353952-1871-46aa-b76a-a475bbc9fb42","Type":"ContainerStarted","Data":"adbe7bf3bc119c0cc1773a3b622612a84cf44b0f98af220e44d098c6b61ac7ae"} Mar 18 19:24:19 crc kubenswrapper[5008]: I0318 19:24:19.879665 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:24:19 crc kubenswrapper[5008]: I0318 19:24:19.880997 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf16152f-7363-4a9a-906b-237917d3e262","Type":"ContainerStarted","Data":"3bb74c343b7f8a6707a486309f0cff40e837a9ce212170b1dba6c5fe658174e9"} Mar 18 19:24:19 crc kubenswrapper[5008]: I0318 19:24:19.881232 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 19:24:19 crc kubenswrapper[5008]: I0318 19:24:19.913773 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.913749823 podStartE2EDuration="36.913749823s" podCreationTimestamp="2026-03-18 19:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:24:19.907001419 +0000 UTC m=+4916.426474498" watchObservedRunningTime="2026-03-18 19:24:19.913749823 +0000 UTC m=+4916.433222912" Mar 18 19:24:19 crc kubenswrapper[5008]: I0318 19:24:19.941294 5008 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.94126619 podStartE2EDuration="36.94126619s" podCreationTimestamp="2026-03-18 19:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:24:19.940009448 +0000 UTC m=+4916.459482537" watchObservedRunningTime="2026-03-18 19:24:19.94126619 +0000 UTC m=+4916.460739299" Mar 18 19:24:21 crc kubenswrapper[5008]: I0318 19:24:21.216997 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:21 crc kubenswrapper[5008]: I0318 19:24:21.218225 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:21 crc kubenswrapper[5008]: I0318 19:24:21.264589 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:21 crc kubenswrapper[5008]: I0318 19:24:21.955595 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:22 crc kubenswrapper[5008]: I0318 19:24:22.018247 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sz4dr"] Mar 18 19:24:23 crc kubenswrapper[5008]: I0318 19:24:23.925465 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sz4dr" podUID="1f13e170-ee8b-42bd-b267-2aeaec25edfb" containerName="registry-server" containerID="cri-o://4041c1a9dfcf7373e4193dead8fcb87936ef90f402ace9edd2bb1fc2efc53b47" gracePeriod=2 Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.422358 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.540677 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f13e170-ee8b-42bd-b267-2aeaec25edfb-utilities\") pod \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\" (UID: \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\") " Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.540744 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbcrf\" (UniqueName: \"kubernetes.io/projected/1f13e170-ee8b-42bd-b267-2aeaec25edfb-kube-api-access-hbcrf\") pod \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\" (UID: \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\") " Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.540790 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f13e170-ee8b-42bd-b267-2aeaec25edfb-catalog-content\") pod \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\" (UID: \"1f13e170-ee8b-42bd-b267-2aeaec25edfb\") " Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.541650 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f13e170-ee8b-42bd-b267-2aeaec25edfb-utilities" (OuterVolumeSpecName: "utilities") pod "1f13e170-ee8b-42bd-b267-2aeaec25edfb" (UID: "1f13e170-ee8b-42bd-b267-2aeaec25edfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.555708 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f13e170-ee8b-42bd-b267-2aeaec25edfb-kube-api-access-hbcrf" (OuterVolumeSpecName: "kube-api-access-hbcrf") pod "1f13e170-ee8b-42bd-b267-2aeaec25edfb" (UID: "1f13e170-ee8b-42bd-b267-2aeaec25edfb"). InnerVolumeSpecName "kube-api-access-hbcrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.642497 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f13e170-ee8b-42bd-b267-2aeaec25edfb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.642533 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbcrf\" (UniqueName: \"kubernetes.io/projected/1f13e170-ee8b-42bd-b267-2aeaec25edfb-kube-api-access-hbcrf\") on node \"crc\" DevicePath \"\"" Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.663815 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f13e170-ee8b-42bd-b267-2aeaec25edfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f13e170-ee8b-42bd-b267-2aeaec25edfb" (UID: "1f13e170-ee8b-42bd-b267-2aeaec25edfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.744378 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f13e170-ee8b-42bd-b267-2aeaec25edfb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.936524 5008 generic.go:334] "Generic (PLEG): container finished" podID="1f13e170-ee8b-42bd-b267-2aeaec25edfb" containerID="4041c1a9dfcf7373e4193dead8fcb87936ef90f402ace9edd2bb1fc2efc53b47" exitCode=0 Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.936595 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sz4dr" event={"ID":"1f13e170-ee8b-42bd-b267-2aeaec25edfb","Type":"ContainerDied","Data":"4041c1a9dfcf7373e4193dead8fcb87936ef90f402ace9edd2bb1fc2efc53b47"} Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.936626 5008 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-sz4dr" event={"ID":"1f13e170-ee8b-42bd-b267-2aeaec25edfb","Type":"ContainerDied","Data":"ba456f71168a6d3255650ce80502770a20016b8c8d4d210b6b99e74b20cf59c5"} Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.936647 5008 scope.go:117] "RemoveContainer" containerID="4041c1a9dfcf7373e4193dead8fcb87936ef90f402ace9edd2bb1fc2efc53b47" Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.936694 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sz4dr" Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.963997 5008 scope.go:117] "RemoveContainer" containerID="d30f7a308aa6dc08c5c5cc372567da9675ddd88b0ca5c9602564cf07542566a7" Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.986385 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sz4dr"] Mar 18 19:24:24 crc kubenswrapper[5008]: I0318 19:24:24.991869 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sz4dr"] Mar 18 19:24:25 crc kubenswrapper[5008]: I0318 19:24:25.024919 5008 scope.go:117] "RemoveContainer" containerID="54546dd824552e8ce07b0a545e3111b0592948224fbf3872836dd8bb12475ea5" Mar 18 19:24:25 crc kubenswrapper[5008]: I0318 19:24:25.057726 5008 scope.go:117] "RemoveContainer" containerID="4041c1a9dfcf7373e4193dead8fcb87936ef90f402ace9edd2bb1fc2efc53b47" Mar 18 19:24:25 crc kubenswrapper[5008]: E0318 19:24:25.058280 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4041c1a9dfcf7373e4193dead8fcb87936ef90f402ace9edd2bb1fc2efc53b47\": container with ID starting with 4041c1a9dfcf7373e4193dead8fcb87936ef90f402ace9edd2bb1fc2efc53b47 not found: ID does not exist" containerID="4041c1a9dfcf7373e4193dead8fcb87936ef90f402ace9edd2bb1fc2efc53b47" Mar 18 19:24:25 crc kubenswrapper[5008]: I0318 
19:24:25.058332 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4041c1a9dfcf7373e4193dead8fcb87936ef90f402ace9edd2bb1fc2efc53b47"} err="failed to get container status \"4041c1a9dfcf7373e4193dead8fcb87936ef90f402ace9edd2bb1fc2efc53b47\": rpc error: code = NotFound desc = could not find container \"4041c1a9dfcf7373e4193dead8fcb87936ef90f402ace9edd2bb1fc2efc53b47\": container with ID starting with 4041c1a9dfcf7373e4193dead8fcb87936ef90f402ace9edd2bb1fc2efc53b47 not found: ID does not exist" Mar 18 19:24:25 crc kubenswrapper[5008]: I0318 19:24:25.058367 5008 scope.go:117] "RemoveContainer" containerID="d30f7a308aa6dc08c5c5cc372567da9675ddd88b0ca5c9602564cf07542566a7" Mar 18 19:24:25 crc kubenswrapper[5008]: E0318 19:24:25.058986 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d30f7a308aa6dc08c5c5cc372567da9675ddd88b0ca5c9602564cf07542566a7\": container with ID starting with d30f7a308aa6dc08c5c5cc372567da9675ddd88b0ca5c9602564cf07542566a7 not found: ID does not exist" containerID="d30f7a308aa6dc08c5c5cc372567da9675ddd88b0ca5c9602564cf07542566a7" Mar 18 19:24:25 crc kubenswrapper[5008]: I0318 19:24:25.059034 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d30f7a308aa6dc08c5c5cc372567da9675ddd88b0ca5c9602564cf07542566a7"} err="failed to get container status \"d30f7a308aa6dc08c5c5cc372567da9675ddd88b0ca5c9602564cf07542566a7\": rpc error: code = NotFound desc = could not find container \"d30f7a308aa6dc08c5c5cc372567da9675ddd88b0ca5c9602564cf07542566a7\": container with ID starting with d30f7a308aa6dc08c5c5cc372567da9675ddd88b0ca5c9602564cf07542566a7 not found: ID does not exist" Mar 18 19:24:25 crc kubenswrapper[5008]: I0318 19:24:25.059060 5008 scope.go:117] "RemoveContainer" containerID="54546dd824552e8ce07b0a545e3111b0592948224fbf3872836dd8bb12475ea5" Mar 18 19:24:25 crc 
kubenswrapper[5008]: E0318 19:24:25.059647 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54546dd824552e8ce07b0a545e3111b0592948224fbf3872836dd8bb12475ea5\": container with ID starting with 54546dd824552e8ce07b0a545e3111b0592948224fbf3872836dd8bb12475ea5 not found: ID does not exist" containerID="54546dd824552e8ce07b0a545e3111b0592948224fbf3872836dd8bb12475ea5" Mar 18 19:24:25 crc kubenswrapper[5008]: I0318 19:24:25.059695 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54546dd824552e8ce07b0a545e3111b0592948224fbf3872836dd8bb12475ea5"} err="failed to get container status \"54546dd824552e8ce07b0a545e3111b0592948224fbf3872836dd8bb12475ea5\": rpc error: code = NotFound desc = could not find container \"54546dd824552e8ce07b0a545e3111b0592948224fbf3872836dd8bb12475ea5\": container with ID starting with 54546dd824552e8ce07b0a545e3111b0592948224fbf3872836dd8bb12475ea5 not found: ID does not exist" Mar 18 19:24:26 crc kubenswrapper[5008]: I0318 19:24:26.206342 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f13e170-ee8b-42bd-b267-2aeaec25edfb" path="/var/lib/kubelet/pods/1f13e170-ee8b-42bd-b267-2aeaec25edfb/volumes" Mar 18 19:24:33 crc kubenswrapper[5008]: I0318 19:24:33.752126 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 19:24:33 crc kubenswrapper[5008]: I0318 19:24:33.776867 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 19:24:41 crc kubenswrapper[5008]: I0318 19:24:41.705221 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 18 19:24:41 crc kubenswrapper[5008]: E0318 19:24:41.706152 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f13e170-ee8b-42bd-b267-2aeaec25edfb" containerName="extract-utilities" Mar 18 
19:24:41 crc kubenswrapper[5008]: I0318 19:24:41.706173 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f13e170-ee8b-42bd-b267-2aeaec25edfb" containerName="extract-utilities" Mar 18 19:24:41 crc kubenswrapper[5008]: E0318 19:24:41.706193 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f13e170-ee8b-42bd-b267-2aeaec25edfb" containerName="extract-content" Mar 18 19:24:41 crc kubenswrapper[5008]: I0318 19:24:41.706200 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f13e170-ee8b-42bd-b267-2aeaec25edfb" containerName="extract-content" Mar 18 19:24:41 crc kubenswrapper[5008]: E0318 19:24:41.706210 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f13e170-ee8b-42bd-b267-2aeaec25edfb" containerName="registry-server" Mar 18 19:24:41 crc kubenswrapper[5008]: I0318 19:24:41.706219 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f13e170-ee8b-42bd-b267-2aeaec25edfb" containerName="registry-server" Mar 18 19:24:41 crc kubenswrapper[5008]: I0318 19:24:41.706399 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f13e170-ee8b-42bd-b267-2aeaec25edfb" containerName="registry-server" Mar 18 19:24:41 crc kubenswrapper[5008]: I0318 19:24:41.707044 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:24:41 crc kubenswrapper[5008]: I0318 19:24:41.710841 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5gqqj" Mar 18 19:24:41 crc kubenswrapper[5008]: I0318 19:24:41.714936 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:24:41 crc kubenswrapper[5008]: I0318 19:24:41.793084 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvvj5\" (UniqueName: \"kubernetes.io/projected/f5dab9d2-bf4c-4d91-8409-9c42c5c91a97-kube-api-access-kvvj5\") pod \"mariadb-client\" (UID: \"f5dab9d2-bf4c-4d91-8409-9c42c5c91a97\") " pod="openstack/mariadb-client" Mar 18 19:24:41 crc kubenswrapper[5008]: I0318 19:24:41.894219 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvvj5\" (UniqueName: \"kubernetes.io/projected/f5dab9d2-bf4c-4d91-8409-9c42c5c91a97-kube-api-access-kvvj5\") pod \"mariadb-client\" (UID: \"f5dab9d2-bf4c-4d91-8409-9c42c5c91a97\") " pod="openstack/mariadb-client" Mar 18 19:24:41 crc kubenswrapper[5008]: I0318 19:24:41.914267 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvvj5\" (UniqueName: \"kubernetes.io/projected/f5dab9d2-bf4c-4d91-8409-9c42c5c91a97-kube-api-access-kvvj5\") pod \"mariadb-client\" (UID: \"f5dab9d2-bf4c-4d91-8409-9c42c5c91a97\") " pod="openstack/mariadb-client" Mar 18 19:24:42 crc kubenswrapper[5008]: I0318 19:24:42.085602 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:24:42 crc kubenswrapper[5008]: I0318 19:24:42.647186 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:24:42 crc kubenswrapper[5008]: I0318 19:24:42.752300 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 19:24:43 crc kubenswrapper[5008]: I0318 19:24:43.126493 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f5dab9d2-bf4c-4d91-8409-9c42c5c91a97","Type":"ContainerStarted","Data":"bbdc843055e85c3eaf21fcd6104da9766b7dd85a35569223938ce1bf6fd02c3e"} Mar 18 19:24:44 crc kubenswrapper[5008]: I0318 19:24:44.389008 5008 scope.go:117] "RemoveContainer" containerID="a749c07a415c12f78e7aa072e5e914dceabc4e883ab02814fff9eaddd27f819d" Mar 18 19:24:50 crc kubenswrapper[5008]: I0318 19:24:50.187620 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f5dab9d2-bf4c-4d91-8409-9c42c5c91a97","Type":"ContainerStarted","Data":"9b9b213fa850ef1ea2afb1631a0ea5c5a7dbb3f435d94a8e6be01ca123945ef4"} Mar 18 19:24:50 crc kubenswrapper[5008]: I0318 19:24:50.213053 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.191699364 podStartE2EDuration="9.21302093s" podCreationTimestamp="2026-03-18 19:24:41 +0000 UTC" firstStartedPulling="2026-03-18 19:24:42.752060316 +0000 UTC m=+4939.271533395" lastFinishedPulling="2026-03-18 19:24:49.773381882 +0000 UTC m=+4946.292854961" observedRunningTime="2026-03-18 19:24:50.209284634 +0000 UTC m=+4946.728757773" watchObservedRunningTime="2026-03-18 19:24:50.21302093 +0000 UTC m=+4946.732494069" Mar 18 19:24:54 crc kubenswrapper[5008]: I0318 19:24:54.460222 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:24:54 crc kubenswrapper[5008]: I0318 19:24:54.461070 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:25:04 crc kubenswrapper[5008]: I0318 19:25:04.224657 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:25:04 crc kubenswrapper[5008]: I0318 19:25:04.225447 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="f5dab9d2-bf4c-4d91-8409-9c42c5c91a97" containerName="mariadb-client" containerID="cri-o://9b9b213fa850ef1ea2afb1631a0ea5c5a7dbb3f435d94a8e6be01ca123945ef4" gracePeriod=30 Mar 18 19:25:04 crc kubenswrapper[5008]: I0318 19:25:04.748025 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:25:04 crc kubenswrapper[5008]: I0318 19:25:04.865176 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvvj5\" (UniqueName: \"kubernetes.io/projected/f5dab9d2-bf4c-4d91-8409-9c42c5c91a97-kube-api-access-kvvj5\") pod \"f5dab9d2-bf4c-4d91-8409-9c42c5c91a97\" (UID: \"f5dab9d2-bf4c-4d91-8409-9c42c5c91a97\") " Mar 18 19:25:04 crc kubenswrapper[5008]: I0318 19:25:04.871220 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5dab9d2-bf4c-4d91-8409-9c42c5c91a97-kube-api-access-kvvj5" (OuterVolumeSpecName: "kube-api-access-kvvj5") pod "f5dab9d2-bf4c-4d91-8409-9c42c5c91a97" (UID: "f5dab9d2-bf4c-4d91-8409-9c42c5c91a97"). InnerVolumeSpecName "kube-api-access-kvvj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:25:04 crc kubenswrapper[5008]: I0318 19:25:04.967478 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvvj5\" (UniqueName: \"kubernetes.io/projected/f5dab9d2-bf4c-4d91-8409-9c42c5c91a97-kube-api-access-kvvj5\") on node \"crc\" DevicePath \"\"" Mar 18 19:25:05 crc kubenswrapper[5008]: I0318 19:25:05.636077 5008 generic.go:334] "Generic (PLEG): container finished" podID="f5dab9d2-bf4c-4d91-8409-9c42c5c91a97" containerID="9b9b213fa850ef1ea2afb1631a0ea5c5a7dbb3f435d94a8e6be01ca123945ef4" exitCode=143 Mar 18 19:25:05 crc kubenswrapper[5008]: I0318 19:25:05.636121 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f5dab9d2-bf4c-4d91-8409-9c42c5c91a97","Type":"ContainerDied","Data":"9b9b213fa850ef1ea2afb1631a0ea5c5a7dbb3f435d94a8e6be01ca123945ef4"} Mar 18 19:25:05 crc kubenswrapper[5008]: I0318 19:25:05.636147 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f5dab9d2-bf4c-4d91-8409-9c42c5c91a97","Type":"ContainerDied","Data":"bbdc843055e85c3eaf21fcd6104da9766b7dd85a35569223938ce1bf6fd02c3e"} Mar 18 19:25:05 crc kubenswrapper[5008]: I0318 19:25:05.636162 5008 scope.go:117] "RemoveContainer" containerID="9b9b213fa850ef1ea2afb1631a0ea5c5a7dbb3f435d94a8e6be01ca123945ef4" Mar 18 19:25:05 crc kubenswrapper[5008]: I0318 19:25:05.636275 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:25:05 crc kubenswrapper[5008]: I0318 19:25:05.659975 5008 scope.go:117] "RemoveContainer" containerID="9b9b213fa850ef1ea2afb1631a0ea5c5a7dbb3f435d94a8e6be01ca123945ef4" Mar 18 19:25:05 crc kubenswrapper[5008]: E0318 19:25:05.663767 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9b213fa850ef1ea2afb1631a0ea5c5a7dbb3f435d94a8e6be01ca123945ef4\": container with ID starting with 9b9b213fa850ef1ea2afb1631a0ea5c5a7dbb3f435d94a8e6be01ca123945ef4 not found: ID does not exist" containerID="9b9b213fa850ef1ea2afb1631a0ea5c5a7dbb3f435d94a8e6be01ca123945ef4" Mar 18 19:25:05 crc kubenswrapper[5008]: I0318 19:25:05.663837 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9b213fa850ef1ea2afb1631a0ea5c5a7dbb3f435d94a8e6be01ca123945ef4"} err="failed to get container status \"9b9b213fa850ef1ea2afb1631a0ea5c5a7dbb3f435d94a8e6be01ca123945ef4\": rpc error: code = NotFound desc = could not find container \"9b9b213fa850ef1ea2afb1631a0ea5c5a7dbb3f435d94a8e6be01ca123945ef4\": container with ID starting with 9b9b213fa850ef1ea2afb1631a0ea5c5a7dbb3f435d94a8e6be01ca123945ef4 not found: ID does not exist" Mar 18 19:25:05 crc kubenswrapper[5008]: I0318 19:25:05.671053 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:25:05 crc kubenswrapper[5008]: I0318 19:25:05.677336 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:25:06 crc kubenswrapper[5008]: I0318 19:25:06.212961 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5dab9d2-bf4c-4d91-8409-9c42c5c91a97" path="/var/lib/kubelet/pods/f5dab9d2-bf4c-4d91-8409-9c42c5c91a97/volumes" Mar 18 19:25:24 crc kubenswrapper[5008]: I0318 19:25:24.460473 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:25:24 crc kubenswrapper[5008]: I0318 19:25:24.461882 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:25:44 crc kubenswrapper[5008]: I0318 19:25:44.496808 5008 scope.go:117] "RemoveContainer" containerID="c9d227e75affc9bd49500787dd7499aea5f14d0fde407bf55af0e651a47a380d" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.388957 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-625vp"] Mar 18 19:25:51 crc kubenswrapper[5008]: E0318 19:25:51.390433 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5dab9d2-bf4c-4d91-8409-9c42c5c91a97" containerName="mariadb-client" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.390464 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5dab9d2-bf4c-4d91-8409-9c42c5c91a97" containerName="mariadb-client" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.390920 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5dab9d2-bf4c-4d91-8409-9c42c5c91a97" containerName="mariadb-client" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.393606 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.402238 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-625vp"] Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.488435 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-utilities\") pod \"redhat-marketplace-625vp\" (UID: \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\") " pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.488626 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkwtr\" (UniqueName: \"kubernetes.io/projected/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-kube-api-access-jkwtr\") pod \"redhat-marketplace-625vp\" (UID: \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\") " pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.488690 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-catalog-content\") pod \"redhat-marketplace-625vp\" (UID: \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\") " pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.592654 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkwtr\" (UniqueName: \"kubernetes.io/projected/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-kube-api-access-jkwtr\") pod \"redhat-marketplace-625vp\" (UID: \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\") " pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.592771 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-catalog-content\") pod \"redhat-marketplace-625vp\" (UID: \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\") " pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.593041 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-utilities\") pod \"redhat-marketplace-625vp\" (UID: \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\") " pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.594090 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-utilities\") pod \"redhat-marketplace-625vp\" (UID: \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\") " pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.595208 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-catalog-content\") pod \"redhat-marketplace-625vp\" (UID: \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\") " pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.622848 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkwtr\" (UniqueName: \"kubernetes.io/projected/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-kube-api-access-jkwtr\") pod \"redhat-marketplace-625vp\" (UID: \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\") " pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:25:51 crc kubenswrapper[5008]: I0318 19:25:51.742302 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:25:52 crc kubenswrapper[5008]: I0318 19:25:52.161860 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-625vp"] Mar 18 19:25:53 crc kubenswrapper[5008]: I0318 19:25:53.076613 5008 generic.go:334] "Generic (PLEG): container finished" podID="ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" containerID="53416b50ea7b9e831883ddd6c80e2405ee5b3e1d957268c2ab2fd34b1acb2511" exitCode=0 Mar 18 19:25:53 crc kubenswrapper[5008]: I0318 19:25:53.076756 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-625vp" event={"ID":"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c","Type":"ContainerDied","Data":"53416b50ea7b9e831883ddd6c80e2405ee5b3e1d957268c2ab2fd34b1acb2511"} Mar 18 19:25:53 crc kubenswrapper[5008]: I0318 19:25:53.076985 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-625vp" event={"ID":"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c","Type":"ContainerStarted","Data":"a1bce4fafbd284ec2e86f911d80c252a1953095b2e98a07b52bc9c6d78619c2f"} Mar 18 19:25:53 crc kubenswrapper[5008]: I0318 19:25:53.783917 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zrsgq"] Mar 18 19:25:53 crc kubenswrapper[5008]: I0318 19:25:53.791140 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:25:53 crc kubenswrapper[5008]: I0318 19:25:53.797212 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrsgq"] Mar 18 19:25:53 crc kubenswrapper[5008]: I0318 19:25:53.934110 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c30c8ca8-0c63-42ed-8415-4791cd311fd7-utilities\") pod \"redhat-operators-zrsgq\" (UID: \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\") " pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:25:53 crc kubenswrapper[5008]: I0318 19:25:53.934866 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwd9l\" (UniqueName: \"kubernetes.io/projected/c30c8ca8-0c63-42ed-8415-4791cd311fd7-kube-api-access-wwd9l\") pod \"redhat-operators-zrsgq\" (UID: \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\") " pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:25:53 crc kubenswrapper[5008]: I0318 19:25:53.935115 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c30c8ca8-0c63-42ed-8415-4791cd311fd7-catalog-content\") pod \"redhat-operators-zrsgq\" (UID: \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\") " pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.036876 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c30c8ca8-0c63-42ed-8415-4791cd311fd7-catalog-content\") pod \"redhat-operators-zrsgq\" (UID: \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\") " pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.037221 5008 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c30c8ca8-0c63-42ed-8415-4791cd311fd7-utilities\") pod \"redhat-operators-zrsgq\" (UID: \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\") " pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.037379 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwd9l\" (UniqueName: \"kubernetes.io/projected/c30c8ca8-0c63-42ed-8415-4791cd311fd7-kube-api-access-wwd9l\") pod \"redhat-operators-zrsgq\" (UID: \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\") " pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.037658 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c30c8ca8-0c63-42ed-8415-4791cd311fd7-catalog-content\") pod \"redhat-operators-zrsgq\" (UID: \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\") " pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.037811 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c30c8ca8-0c63-42ed-8415-4791cd311fd7-utilities\") pod \"redhat-operators-zrsgq\" (UID: \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\") " pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.068140 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwd9l\" (UniqueName: \"kubernetes.io/projected/c30c8ca8-0c63-42ed-8415-4791cd311fd7-kube-api-access-wwd9l\") pod \"redhat-operators-zrsgq\" (UID: \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\") " pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.087272 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-625vp" 
event={"ID":"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c","Type":"ContainerStarted","Data":"fe3aa94d8e876086603d071505b183fe5f0d10eded8c27984d27c090b8a4191c"} Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.124326 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.387098 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrsgq"] Mar 18 19:25:54 crc kubenswrapper[5008]: W0318 19:25:54.391675 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc30c8ca8_0c63_42ed_8415_4791cd311fd7.slice/crio-2ff0a67b7a08e03388ea7c3c7ef9502561d80164682fd4bc890ae748cec95119 WatchSource:0}: Error finding container 2ff0a67b7a08e03388ea7c3c7ef9502561d80164682fd4bc890ae748cec95119: Status 404 returned error can't find the container with id 2ff0a67b7a08e03388ea7c3c7ef9502561d80164682fd4bc890ae748cec95119 Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.460378 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.460445 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.460494 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-crzrt" Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.461273 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 19:25:54 crc kubenswrapper[5008]: I0318 19:25:54.461347 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" gracePeriod=600 Mar 18 19:25:54 crc kubenswrapper[5008]: E0318 19:25:54.579292 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:25:55 crc kubenswrapper[5008]: I0318 19:25:55.095575 5008 generic.go:334] "Generic (PLEG): container finished" podID="ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" containerID="fe3aa94d8e876086603d071505b183fe5f0d10eded8c27984d27c090b8a4191c" exitCode=0 Mar 18 19:25:55 crc kubenswrapper[5008]: I0318 19:25:55.095628 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-625vp" event={"ID":"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c","Type":"ContainerDied","Data":"fe3aa94d8e876086603d071505b183fe5f0d10eded8c27984d27c090b8a4191c"} Mar 18 19:25:55 crc 
kubenswrapper[5008]: I0318 19:25:55.098664 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" exitCode=0 Mar 18 19:25:55 crc kubenswrapper[5008]: I0318 19:25:55.098738 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7"} Mar 18 19:25:55 crc kubenswrapper[5008]: I0318 19:25:55.098774 5008 scope.go:117] "RemoveContainer" containerID="4d595231fbacf4e90ffb123dbddc3f5bb05b324fc9dae73ed4f109d00d75ea52" Mar 18 19:25:55 crc kubenswrapper[5008]: I0318 19:25:55.099185 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:25:55 crc kubenswrapper[5008]: E0318 19:25:55.099362 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:25:55 crc kubenswrapper[5008]: I0318 19:25:55.101014 5008 generic.go:334] "Generic (PLEG): container finished" podID="c30c8ca8-0c63-42ed-8415-4791cd311fd7" containerID="2652b98bfdb588352053f6ddf00f2dfd6ab3dfa22e096b0a6b4eebcaf7635283" exitCode=0 Mar 18 19:25:55 crc kubenswrapper[5008]: I0318 19:25:55.101040 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrsgq" event={"ID":"c30c8ca8-0c63-42ed-8415-4791cd311fd7","Type":"ContainerDied","Data":"2652b98bfdb588352053f6ddf00f2dfd6ab3dfa22e096b0a6b4eebcaf7635283"} Mar 
18 19:25:55 crc kubenswrapper[5008]: I0318 19:25:55.101062 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrsgq" event={"ID":"c30c8ca8-0c63-42ed-8415-4791cd311fd7","Type":"ContainerStarted","Data":"2ff0a67b7a08e03388ea7c3c7ef9502561d80164682fd4bc890ae748cec95119"} Mar 18 19:25:56 crc kubenswrapper[5008]: I0318 19:25:56.117831 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-625vp" event={"ID":"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c","Type":"ContainerStarted","Data":"e6f0544d262020bc14f7452cdfc273f5f26493c907b045c6c7a10213adfb6daa"} Mar 18 19:25:56 crc kubenswrapper[5008]: I0318 19:25:56.139935 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-625vp" podStartSLOduration=2.764976338 podStartE2EDuration="5.139918749s" podCreationTimestamp="2026-03-18 19:25:51 +0000 UTC" firstStartedPulling="2026-03-18 19:25:53.079003315 +0000 UTC m=+5009.598476424" lastFinishedPulling="2026-03-18 19:25:55.453945756 +0000 UTC m=+5011.973418835" observedRunningTime="2026-03-18 19:25:56.136003389 +0000 UTC m=+5012.655476478" watchObservedRunningTime="2026-03-18 19:25:56.139918749 +0000 UTC m=+5012.659391828" Mar 18 19:25:57 crc kubenswrapper[5008]: I0318 19:25:57.126853 5008 generic.go:334] "Generic (PLEG): container finished" podID="c30c8ca8-0c63-42ed-8415-4791cd311fd7" containerID="d1420c3d9e32cd1796394c08ab7ab841bc1cdd57f833c031860c63785c9f177c" exitCode=0 Mar 18 19:25:57 crc kubenswrapper[5008]: I0318 19:25:57.126920 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrsgq" event={"ID":"c30c8ca8-0c63-42ed-8415-4791cd311fd7","Type":"ContainerDied","Data":"d1420c3d9e32cd1796394c08ab7ab841bc1cdd57f833c031860c63785c9f177c"} Mar 18 19:25:59 crc kubenswrapper[5008]: I0318 19:25:59.148511 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-zrsgq" event={"ID":"c30c8ca8-0c63-42ed-8415-4791cd311fd7","Type":"ContainerStarted","Data":"1a3bdf0e2fa19463463de5bf7d30be272fa41e93c6c83f4754cf33008973a1dc"} Mar 18 19:25:59 crc kubenswrapper[5008]: I0318 19:25:59.179507 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zrsgq" podStartSLOduration=2.553496826 podStartE2EDuration="6.179482575s" podCreationTimestamp="2026-03-18 19:25:53 +0000 UTC" firstStartedPulling="2026-03-18 19:25:55.102066867 +0000 UTC m=+5011.621539936" lastFinishedPulling="2026-03-18 19:25:58.728052596 +0000 UTC m=+5015.247525685" observedRunningTime="2026-03-18 19:25:59.173323007 +0000 UTC m=+5015.692796156" watchObservedRunningTime="2026-03-18 19:25:59.179482575 +0000 UTC m=+5015.698955664" Mar 18 19:26:00 crc kubenswrapper[5008]: I0318 19:26:00.158767 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564366-jn9mc"] Mar 18 19:26:00 crc kubenswrapper[5008]: I0318 19:26:00.161713 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564366-jn9mc" Mar 18 19:26:00 crc kubenswrapper[5008]: I0318 19:26:00.167988 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:26:00 crc kubenswrapper[5008]: I0318 19:26:00.170201 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:26:00 crc kubenswrapper[5008]: I0318 19:26:00.175481 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564366-jn9mc"] Mar 18 19:26:00 crc kubenswrapper[5008]: I0318 19:26:00.176453 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:26:00 crc kubenswrapper[5008]: I0318 19:26:00.233749 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w94wh\" (UniqueName: \"kubernetes.io/projected/1be5f5c0-c2ef-41d2-91ea-ebe8631b2566-kube-api-access-w94wh\") pod \"auto-csr-approver-29564366-jn9mc\" (UID: \"1be5f5c0-c2ef-41d2-91ea-ebe8631b2566\") " pod="openshift-infra/auto-csr-approver-29564366-jn9mc" Mar 18 19:26:00 crc kubenswrapper[5008]: I0318 19:26:00.337062 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w94wh\" (UniqueName: \"kubernetes.io/projected/1be5f5c0-c2ef-41d2-91ea-ebe8631b2566-kube-api-access-w94wh\") pod \"auto-csr-approver-29564366-jn9mc\" (UID: \"1be5f5c0-c2ef-41d2-91ea-ebe8631b2566\") " pod="openshift-infra/auto-csr-approver-29564366-jn9mc" Mar 18 19:26:00 crc kubenswrapper[5008]: I0318 19:26:00.363245 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w94wh\" (UniqueName: \"kubernetes.io/projected/1be5f5c0-c2ef-41d2-91ea-ebe8631b2566-kube-api-access-w94wh\") pod \"auto-csr-approver-29564366-jn9mc\" (UID: \"1be5f5c0-c2ef-41d2-91ea-ebe8631b2566\") " 
pod="openshift-infra/auto-csr-approver-29564366-jn9mc" Mar 18 19:26:00 crc kubenswrapper[5008]: I0318 19:26:00.477341 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564366-jn9mc" Mar 18 19:26:00 crc kubenswrapper[5008]: I0318 19:26:00.897873 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564366-jn9mc"] Mar 18 19:26:00 crc kubenswrapper[5008]: W0318 19:26:00.909979 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1be5f5c0_c2ef_41d2_91ea_ebe8631b2566.slice/crio-9b1a7d412ffd63a27eba02b1340a0f5da4855aec18114c062a79b6be7d82094b WatchSource:0}: Error finding container 9b1a7d412ffd63a27eba02b1340a0f5da4855aec18114c062a79b6be7d82094b: Status 404 returned error can't find the container with id 9b1a7d412ffd63a27eba02b1340a0f5da4855aec18114c062a79b6be7d82094b Mar 18 19:26:01 crc kubenswrapper[5008]: I0318 19:26:01.180814 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564366-jn9mc" event={"ID":"1be5f5c0-c2ef-41d2-91ea-ebe8631b2566","Type":"ContainerStarted","Data":"9b1a7d412ffd63a27eba02b1340a0f5da4855aec18114c062a79b6be7d82094b"} Mar 18 19:26:01 crc kubenswrapper[5008]: I0318 19:26:01.742775 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:26:01 crc kubenswrapper[5008]: I0318 19:26:01.743225 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:26:01 crc kubenswrapper[5008]: I0318 19:26:01.808473 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:26:02 crc kubenswrapper[5008]: I0318 19:26:02.255907 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:26:02 crc kubenswrapper[5008]: I0318 19:26:02.968634 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-625vp"] Mar 18 19:26:03 crc kubenswrapper[5008]: I0318 19:26:03.197082 5008 generic.go:334] "Generic (PLEG): container finished" podID="1be5f5c0-c2ef-41d2-91ea-ebe8631b2566" containerID="4a642dbc62e9f7a8a7cc2d5122326b8d39353cc03ba3972772ecf36c4cb6c0fd" exitCode=0 Mar 18 19:26:03 crc kubenswrapper[5008]: I0318 19:26:03.197127 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564366-jn9mc" event={"ID":"1be5f5c0-c2ef-41d2-91ea-ebe8631b2566","Type":"ContainerDied","Data":"4a642dbc62e9f7a8a7cc2d5122326b8d39353cc03ba3972772ecf36c4cb6c0fd"} Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.125192 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.126831 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.209834 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-625vp" podUID="ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" containerName="registry-server" containerID="cri-o://e6f0544d262020bc14f7452cdfc273f5f26493c907b045c6c7a10213adfb6daa" gracePeriod=2 Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.556473 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564366-jn9mc" Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.634731 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.707308 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtr\" (UniqueName: \"kubernetes.io/projected/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-kube-api-access-jkwtr\") pod \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\" (UID: \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\") " Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.707434 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w94wh\" (UniqueName: \"kubernetes.io/projected/1be5f5c0-c2ef-41d2-91ea-ebe8631b2566-kube-api-access-w94wh\") pod \"1be5f5c0-c2ef-41d2-91ea-ebe8631b2566\" (UID: \"1be5f5c0-c2ef-41d2-91ea-ebe8631b2566\") " Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.707491 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-catalog-content\") pod \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\" (UID: \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\") " Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.707580 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-utilities\") pod \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\" (UID: \"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c\") " Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.708527 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-utilities" (OuterVolumeSpecName: "utilities") pod "ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" (UID: "ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.713732 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be5f5c0-c2ef-41d2-91ea-ebe8631b2566-kube-api-access-w94wh" (OuterVolumeSpecName: "kube-api-access-w94wh") pod "1be5f5c0-c2ef-41d2-91ea-ebe8631b2566" (UID: "1be5f5c0-c2ef-41d2-91ea-ebe8631b2566"). InnerVolumeSpecName "kube-api-access-w94wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.718430 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-kube-api-access-jkwtr" (OuterVolumeSpecName: "kube-api-access-jkwtr") pod "ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" (UID: "ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c"). InnerVolumeSpecName "kube-api-access-jkwtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.809513 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtr\" (UniqueName: \"kubernetes.io/projected/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-kube-api-access-jkwtr\") on node \"crc\" DevicePath \"\"" Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.809572 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w94wh\" (UniqueName: \"kubernetes.io/projected/1be5f5c0-c2ef-41d2-91ea-ebe8631b2566-kube-api-access-w94wh\") on node \"crc\" DevicePath \"\"" Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.809586 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.901741 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" (UID: "ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:26:04 crc kubenswrapper[5008]: I0318 19:26:04.911361 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.197071 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zrsgq" podUID="c30c8ca8-0c63-42ed-8415-4791cd311fd7" containerName="registry-server" probeResult="failure" output=< Mar 18 19:26:05 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s Mar 18 19:26:05 crc kubenswrapper[5008]: > Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.219373 5008 generic.go:334] "Generic (PLEG): container finished" podID="ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" containerID="e6f0544d262020bc14f7452cdfc273f5f26493c907b045c6c7a10213adfb6daa" exitCode=0 Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.219477 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-625vp" Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.219516 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-625vp" event={"ID":"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c","Type":"ContainerDied","Data":"e6f0544d262020bc14f7452cdfc273f5f26493c907b045c6c7a10213adfb6daa"} Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.219589 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-625vp" event={"ID":"ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c","Type":"ContainerDied","Data":"a1bce4fafbd284ec2e86f911d80c252a1953095b2e98a07b52bc9c6d78619c2f"} Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.219624 5008 scope.go:117] "RemoveContainer" containerID="e6f0544d262020bc14f7452cdfc273f5f26493c907b045c6c7a10213adfb6daa" Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.223032 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564366-jn9mc" event={"ID":"1be5f5c0-c2ef-41d2-91ea-ebe8631b2566","Type":"ContainerDied","Data":"9b1a7d412ffd63a27eba02b1340a0f5da4855aec18114c062a79b6be7d82094b"} Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.223067 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b1a7d412ffd63a27eba02b1340a0f5da4855aec18114c062a79b6be7d82094b" Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.223127 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564366-jn9mc" Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.250838 5008 scope.go:117] "RemoveContainer" containerID="fe3aa94d8e876086603d071505b183fe5f0d10eded8c27984d27c090b8a4191c" Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.284877 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-625vp"] Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.295204 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-625vp"] Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.311718 5008 scope.go:117] "RemoveContainer" containerID="53416b50ea7b9e831883ddd6c80e2405ee5b3e1d957268c2ab2fd34b1acb2511" Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.370868 5008 scope.go:117] "RemoveContainer" containerID="e6f0544d262020bc14f7452cdfc273f5f26493c907b045c6c7a10213adfb6daa" Mar 18 19:26:05 crc kubenswrapper[5008]: E0318 19:26:05.371355 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f0544d262020bc14f7452cdfc273f5f26493c907b045c6c7a10213adfb6daa\": container with ID starting with e6f0544d262020bc14f7452cdfc273f5f26493c907b045c6c7a10213adfb6daa not found: ID does not exist" containerID="e6f0544d262020bc14f7452cdfc273f5f26493c907b045c6c7a10213adfb6daa" Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.371394 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f0544d262020bc14f7452cdfc273f5f26493c907b045c6c7a10213adfb6daa"} err="failed to get container status \"e6f0544d262020bc14f7452cdfc273f5f26493c907b045c6c7a10213adfb6daa\": rpc error: code = NotFound desc = could not find container \"e6f0544d262020bc14f7452cdfc273f5f26493c907b045c6c7a10213adfb6daa\": container with ID starting with e6f0544d262020bc14f7452cdfc273f5f26493c907b045c6c7a10213adfb6daa not found: 
ID does not exist" Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.371427 5008 scope.go:117] "RemoveContainer" containerID="fe3aa94d8e876086603d071505b183fe5f0d10eded8c27984d27c090b8a4191c" Mar 18 19:26:05 crc kubenswrapper[5008]: E0318 19:26:05.375749 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe3aa94d8e876086603d071505b183fe5f0d10eded8c27984d27c090b8a4191c\": container with ID starting with fe3aa94d8e876086603d071505b183fe5f0d10eded8c27984d27c090b8a4191c not found: ID does not exist" containerID="fe3aa94d8e876086603d071505b183fe5f0d10eded8c27984d27c090b8a4191c" Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.375817 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe3aa94d8e876086603d071505b183fe5f0d10eded8c27984d27c090b8a4191c"} err="failed to get container status \"fe3aa94d8e876086603d071505b183fe5f0d10eded8c27984d27c090b8a4191c\": rpc error: code = NotFound desc = could not find container \"fe3aa94d8e876086603d071505b183fe5f0d10eded8c27984d27c090b8a4191c\": container with ID starting with fe3aa94d8e876086603d071505b183fe5f0d10eded8c27984d27c090b8a4191c not found: ID does not exist" Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.375850 5008 scope.go:117] "RemoveContainer" containerID="53416b50ea7b9e831883ddd6c80e2405ee5b3e1d957268c2ab2fd34b1acb2511" Mar 18 19:26:05 crc kubenswrapper[5008]: E0318 19:26:05.376341 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53416b50ea7b9e831883ddd6c80e2405ee5b3e1d957268c2ab2fd34b1acb2511\": container with ID starting with 53416b50ea7b9e831883ddd6c80e2405ee5b3e1d957268c2ab2fd34b1acb2511 not found: ID does not exist" containerID="53416b50ea7b9e831883ddd6c80e2405ee5b3e1d957268c2ab2fd34b1acb2511" Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.376372 5008 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53416b50ea7b9e831883ddd6c80e2405ee5b3e1d957268c2ab2fd34b1acb2511"} err="failed to get container status \"53416b50ea7b9e831883ddd6c80e2405ee5b3e1d957268c2ab2fd34b1acb2511\": rpc error: code = NotFound desc = could not find container \"53416b50ea7b9e831883ddd6c80e2405ee5b3e1d957268c2ab2fd34b1acb2511\": container with ID starting with 53416b50ea7b9e831883ddd6c80e2405ee5b3e1d957268c2ab2fd34b1acb2511 not found: ID does not exist" Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.630675 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564360-v5fqc"] Mar 18 19:26:05 crc kubenswrapper[5008]: I0318 19:26:05.639088 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564360-v5fqc"] Mar 18 19:26:06 crc kubenswrapper[5008]: I0318 19:26:06.213754 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db7a27e-6ce2-4068-aff2-0f476262ec37" path="/var/lib/kubelet/pods/8db7a27e-6ce2-4068-aff2-0f476262ec37/volumes" Mar 18 19:26:06 crc kubenswrapper[5008]: I0318 19:26:06.215214 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" path="/var/lib/kubelet/pods/ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c/volumes" Mar 18 19:26:10 crc kubenswrapper[5008]: I0318 19:26:10.199068 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:26:10 crc kubenswrapper[5008]: E0318 19:26:10.200226 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" 
podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:26:14 crc kubenswrapper[5008]: I0318 19:26:14.214367 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:26:14 crc kubenswrapper[5008]: I0318 19:26:14.292022 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:26:14 crc kubenswrapper[5008]: I0318 19:26:14.467252 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zrsgq"] Mar 18 19:26:15 crc kubenswrapper[5008]: I0318 19:26:15.326460 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zrsgq" podUID="c30c8ca8-0c63-42ed-8415-4791cd311fd7" containerName="registry-server" containerID="cri-o://1a3bdf0e2fa19463463de5bf7d30be272fa41e93c6c83f4754cf33008973a1dc" gracePeriod=2 Mar 18 19:26:15 crc kubenswrapper[5008]: I0318 19:26:15.758753 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:26:15 crc kubenswrapper[5008]: I0318 19:26:15.905423 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwd9l\" (UniqueName: \"kubernetes.io/projected/c30c8ca8-0c63-42ed-8415-4791cd311fd7-kube-api-access-wwd9l\") pod \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\" (UID: \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\") " Mar 18 19:26:15 crc kubenswrapper[5008]: I0318 19:26:15.905530 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c30c8ca8-0c63-42ed-8415-4791cd311fd7-utilities\") pod \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\" (UID: \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\") " Mar 18 19:26:15 crc kubenswrapper[5008]: I0318 19:26:15.905580 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c30c8ca8-0c63-42ed-8415-4791cd311fd7-catalog-content\") pod \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\" (UID: \"c30c8ca8-0c63-42ed-8415-4791cd311fd7\") " Mar 18 19:26:15 crc kubenswrapper[5008]: I0318 19:26:15.907029 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c30c8ca8-0c63-42ed-8415-4791cd311fd7-utilities" (OuterVolumeSpecName: "utilities") pod "c30c8ca8-0c63-42ed-8415-4791cd311fd7" (UID: "c30c8ca8-0c63-42ed-8415-4791cd311fd7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:26:15 crc kubenswrapper[5008]: I0318 19:26:15.915218 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c30c8ca8-0c63-42ed-8415-4791cd311fd7-kube-api-access-wwd9l" (OuterVolumeSpecName: "kube-api-access-wwd9l") pod "c30c8ca8-0c63-42ed-8415-4791cd311fd7" (UID: "c30c8ca8-0c63-42ed-8415-4791cd311fd7"). InnerVolumeSpecName "kube-api-access-wwd9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.007517 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c30c8ca8-0c63-42ed-8415-4791cd311fd7-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.007562 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwd9l\" (UniqueName: \"kubernetes.io/projected/c30c8ca8-0c63-42ed-8415-4791cd311fd7-kube-api-access-wwd9l\") on node \"crc\" DevicePath \"\"" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.069592 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c30c8ca8-0c63-42ed-8415-4791cd311fd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c30c8ca8-0c63-42ed-8415-4791cd311fd7" (UID: "c30c8ca8-0c63-42ed-8415-4791cd311fd7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.109402 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c30c8ca8-0c63-42ed-8415-4791cd311fd7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.339281 5008 generic.go:334] "Generic (PLEG): container finished" podID="c30c8ca8-0c63-42ed-8415-4791cd311fd7" containerID="1a3bdf0e2fa19463463de5bf7d30be272fa41e93c6c83f4754cf33008973a1dc" exitCode=0 Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.339326 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrsgq" event={"ID":"c30c8ca8-0c63-42ed-8415-4791cd311fd7","Type":"ContainerDied","Data":"1a3bdf0e2fa19463463de5bf7d30be272fa41e93c6c83f4754cf33008973a1dc"} Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.339429 5008 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-zrsgq" event={"ID":"c30c8ca8-0c63-42ed-8415-4791cd311fd7","Type":"ContainerDied","Data":"2ff0a67b7a08e03388ea7c3c7ef9502561d80164682fd4bc890ae748cec95119"} Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.339464 5008 scope.go:117] "RemoveContainer" containerID="1a3bdf0e2fa19463463de5bf7d30be272fa41e93c6c83f4754cf33008973a1dc" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.339345 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrsgq" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.378731 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zrsgq"] Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.390199 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zrsgq"] Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.392378 5008 scope.go:117] "RemoveContainer" containerID="d1420c3d9e32cd1796394c08ab7ab841bc1cdd57f833c031860c63785c9f177c" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.424543 5008 scope.go:117] "RemoveContainer" containerID="2652b98bfdb588352053f6ddf00f2dfd6ab3dfa22e096b0a6b4eebcaf7635283" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.456920 5008 scope.go:117] "RemoveContainer" containerID="1a3bdf0e2fa19463463de5bf7d30be272fa41e93c6c83f4754cf33008973a1dc" Mar 18 19:26:16 crc kubenswrapper[5008]: E0318 19:26:16.457521 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a3bdf0e2fa19463463de5bf7d30be272fa41e93c6c83f4754cf33008973a1dc\": container with ID starting with 1a3bdf0e2fa19463463de5bf7d30be272fa41e93c6c83f4754cf33008973a1dc not found: ID does not exist" containerID="1a3bdf0e2fa19463463de5bf7d30be272fa41e93c6c83f4754cf33008973a1dc" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.457592 5008 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a3bdf0e2fa19463463de5bf7d30be272fa41e93c6c83f4754cf33008973a1dc"} err="failed to get container status \"1a3bdf0e2fa19463463de5bf7d30be272fa41e93c6c83f4754cf33008973a1dc\": rpc error: code = NotFound desc = could not find container \"1a3bdf0e2fa19463463de5bf7d30be272fa41e93c6c83f4754cf33008973a1dc\": container with ID starting with 1a3bdf0e2fa19463463de5bf7d30be272fa41e93c6c83f4754cf33008973a1dc not found: ID does not exist" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.457627 5008 scope.go:117] "RemoveContainer" containerID="d1420c3d9e32cd1796394c08ab7ab841bc1cdd57f833c031860c63785c9f177c" Mar 18 19:26:16 crc kubenswrapper[5008]: E0318 19:26:16.458253 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1420c3d9e32cd1796394c08ab7ab841bc1cdd57f833c031860c63785c9f177c\": container with ID starting with d1420c3d9e32cd1796394c08ab7ab841bc1cdd57f833c031860c63785c9f177c not found: ID does not exist" containerID="d1420c3d9e32cd1796394c08ab7ab841bc1cdd57f833c031860c63785c9f177c" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.458279 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1420c3d9e32cd1796394c08ab7ab841bc1cdd57f833c031860c63785c9f177c"} err="failed to get container status \"d1420c3d9e32cd1796394c08ab7ab841bc1cdd57f833c031860c63785c9f177c\": rpc error: code = NotFound desc = could not find container \"d1420c3d9e32cd1796394c08ab7ab841bc1cdd57f833c031860c63785c9f177c\": container with ID starting with d1420c3d9e32cd1796394c08ab7ab841bc1cdd57f833c031860c63785c9f177c not found: ID does not exist" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.458298 5008 scope.go:117] "RemoveContainer" containerID="2652b98bfdb588352053f6ddf00f2dfd6ab3dfa22e096b0a6b4eebcaf7635283" Mar 18 19:26:16 crc kubenswrapper[5008]: E0318 
19:26:16.458990 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2652b98bfdb588352053f6ddf00f2dfd6ab3dfa22e096b0a6b4eebcaf7635283\": container with ID starting with 2652b98bfdb588352053f6ddf00f2dfd6ab3dfa22e096b0a6b4eebcaf7635283 not found: ID does not exist" containerID="2652b98bfdb588352053f6ddf00f2dfd6ab3dfa22e096b0a6b4eebcaf7635283" Mar 18 19:26:16 crc kubenswrapper[5008]: I0318 19:26:16.459051 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2652b98bfdb588352053f6ddf00f2dfd6ab3dfa22e096b0a6b4eebcaf7635283"} err="failed to get container status \"2652b98bfdb588352053f6ddf00f2dfd6ab3dfa22e096b0a6b4eebcaf7635283\": rpc error: code = NotFound desc = could not find container \"2652b98bfdb588352053f6ddf00f2dfd6ab3dfa22e096b0a6b4eebcaf7635283\": container with ID starting with 2652b98bfdb588352053f6ddf00f2dfd6ab3dfa22e096b0a6b4eebcaf7635283 not found: ID does not exist" Mar 18 19:26:18 crc kubenswrapper[5008]: I0318 19:26:18.209536 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c30c8ca8-0c63-42ed-8415-4791cd311fd7" path="/var/lib/kubelet/pods/c30c8ca8-0c63-42ed-8415-4791cd311fd7/volumes" Mar 18 19:26:23 crc kubenswrapper[5008]: I0318 19:26:23.198261 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:26:23 crc kubenswrapper[5008]: E0318 19:26:23.199017 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:26:36 crc kubenswrapper[5008]: I0318 19:26:36.198104 
5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:26:36 crc kubenswrapper[5008]: E0318 19:26:36.198946 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:26:44 crc kubenswrapper[5008]: I0318 19:26:44.566647 5008 scope.go:117] "RemoveContainer" containerID="8536dbd92da6ca89cfc54e81f61eeec43f9d339c033b1fb2ef4cbb3acf8e4bc5" Mar 18 19:26:48 crc kubenswrapper[5008]: I0318 19:26:48.197943 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:26:48 crc kubenswrapper[5008]: E0318 19:26:48.198506 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:26:59 crc kubenswrapper[5008]: I0318 19:26:59.198863 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:26:59 crc kubenswrapper[5008]: E0318 19:26:59.199944 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:27:11 crc kubenswrapper[5008]: I0318 19:27:11.199348 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:27:11 crc kubenswrapper[5008]: E0318 19:27:11.200827 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:27:22 crc kubenswrapper[5008]: I0318 19:27:22.198392 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:27:22 crc kubenswrapper[5008]: E0318 19:27:22.199540 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:27:33 crc kubenswrapper[5008]: I0318 19:27:33.198444 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:27:33 crc kubenswrapper[5008]: E0318 19:27:33.199605 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:27:47 crc kubenswrapper[5008]: I0318 19:27:47.198750 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:27:47 crc kubenswrapper[5008]: E0318 19:27:47.199625 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.166662 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564368-pw579"] Mar 18 19:28:00 crc kubenswrapper[5008]: E0318 19:28:00.167905 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" containerName="extract-utilities" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.167929 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" containerName="extract-utilities" Mar 18 19:28:00 crc kubenswrapper[5008]: E0318 19:28:00.167951 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30c8ca8-0c63-42ed-8415-4791cd311fd7" containerName="extract-content" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.167964 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30c8ca8-0c63-42ed-8415-4791cd311fd7" containerName="extract-content" Mar 18 19:28:00 crc kubenswrapper[5008]: E0318 19:28:00.167990 5008 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c30c8ca8-0c63-42ed-8415-4791cd311fd7" containerName="registry-server" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.168003 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30c8ca8-0c63-42ed-8415-4791cd311fd7" containerName="registry-server" Mar 18 19:28:00 crc kubenswrapper[5008]: E0318 19:28:00.168030 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be5f5c0-c2ef-41d2-91ea-ebe8631b2566" containerName="oc" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.168043 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be5f5c0-c2ef-41d2-91ea-ebe8631b2566" containerName="oc" Mar 18 19:28:00 crc kubenswrapper[5008]: E0318 19:28:00.168063 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30c8ca8-0c63-42ed-8415-4791cd311fd7" containerName="extract-utilities" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.168075 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30c8ca8-0c63-42ed-8415-4791cd311fd7" containerName="extract-utilities" Mar 18 19:28:00 crc kubenswrapper[5008]: E0318 19:28:00.168108 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" containerName="extract-content" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.168119 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" containerName="extract-content" Mar 18 19:28:00 crc kubenswrapper[5008]: E0318 19:28:00.168135 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" containerName="registry-server" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.168149 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" containerName="registry-server" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.171795 5008 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ec7caa85-2010-41c3-8bf3-c9fe62bd2e6c" containerName="registry-server" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.171838 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="c30c8ca8-0c63-42ed-8415-4791cd311fd7" containerName="registry-server" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.171874 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be5f5c0-c2ef-41d2-91ea-ebe8631b2566" containerName="oc" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.172762 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564368-pw579" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.176395 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.176651 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.177717 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564368-pw579"] Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.180366 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.320640 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5wrd\" (UniqueName: \"kubernetes.io/projected/36c2a2b0-1576-4984-b1f3-97799587fccc-kube-api-access-k5wrd\") pod \"auto-csr-approver-29564368-pw579\" (UID: \"36c2a2b0-1576-4984-b1f3-97799587fccc\") " pod="openshift-infra/auto-csr-approver-29564368-pw579" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.422168 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5wrd\" (UniqueName: 
\"kubernetes.io/projected/36c2a2b0-1576-4984-b1f3-97799587fccc-kube-api-access-k5wrd\") pod \"auto-csr-approver-29564368-pw579\" (UID: \"36c2a2b0-1576-4984-b1f3-97799587fccc\") " pod="openshift-infra/auto-csr-approver-29564368-pw579" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.453628 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5wrd\" (UniqueName: \"kubernetes.io/projected/36c2a2b0-1576-4984-b1f3-97799587fccc-kube-api-access-k5wrd\") pod \"auto-csr-approver-29564368-pw579\" (UID: \"36c2a2b0-1576-4984-b1f3-97799587fccc\") " pod="openshift-infra/auto-csr-approver-29564368-pw579" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.495263 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564368-pw579" Mar 18 19:28:00 crc kubenswrapper[5008]: I0318 19:28:00.957314 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564368-pw579"] Mar 18 19:28:01 crc kubenswrapper[5008]: I0318 19:28:01.198427 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:28:01 crc kubenswrapper[5008]: E0318 19:28:01.198720 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:28:01 crc kubenswrapper[5008]: I0318 19:28:01.313528 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564368-pw579" 
event={"ID":"36c2a2b0-1576-4984-b1f3-97799587fccc","Type":"ContainerStarted","Data":"07c07895b72cf72530c68403e19f9590e7406c5de639381d4f7eaafbad8f2d54"} Mar 18 19:28:03 crc kubenswrapper[5008]: I0318 19:28:03.338057 5008 generic.go:334] "Generic (PLEG): container finished" podID="36c2a2b0-1576-4984-b1f3-97799587fccc" containerID="941d08f5ef792c7e42bc110c9d8a0641b732abcc5d370590321d68789a5f7d62" exitCode=0 Mar 18 19:28:03 crc kubenswrapper[5008]: I0318 19:28:03.338137 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564368-pw579" event={"ID":"36c2a2b0-1576-4984-b1f3-97799587fccc","Type":"ContainerDied","Data":"941d08f5ef792c7e42bc110c9d8a0641b732abcc5d370590321d68789a5f7d62"} Mar 18 19:28:04 crc kubenswrapper[5008]: I0318 19:28:04.671923 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564368-pw579" Mar 18 19:28:04 crc kubenswrapper[5008]: I0318 19:28:04.791943 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5wrd\" (UniqueName: \"kubernetes.io/projected/36c2a2b0-1576-4984-b1f3-97799587fccc-kube-api-access-k5wrd\") pod \"36c2a2b0-1576-4984-b1f3-97799587fccc\" (UID: \"36c2a2b0-1576-4984-b1f3-97799587fccc\") " Mar 18 19:28:04 crc kubenswrapper[5008]: I0318 19:28:04.797562 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c2a2b0-1576-4984-b1f3-97799587fccc-kube-api-access-k5wrd" (OuterVolumeSpecName: "kube-api-access-k5wrd") pod "36c2a2b0-1576-4984-b1f3-97799587fccc" (UID: "36c2a2b0-1576-4984-b1f3-97799587fccc"). InnerVolumeSpecName "kube-api-access-k5wrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:28:04 crc kubenswrapper[5008]: I0318 19:28:04.893926 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5wrd\" (UniqueName: \"kubernetes.io/projected/36c2a2b0-1576-4984-b1f3-97799587fccc-kube-api-access-k5wrd\") on node \"crc\" DevicePath \"\"" Mar 18 19:28:05 crc kubenswrapper[5008]: I0318 19:28:05.362028 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564368-pw579" event={"ID":"36c2a2b0-1576-4984-b1f3-97799587fccc","Type":"ContainerDied","Data":"07c07895b72cf72530c68403e19f9590e7406c5de639381d4f7eaafbad8f2d54"} Mar 18 19:28:05 crc kubenswrapper[5008]: I0318 19:28:05.362070 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564368-pw579" Mar 18 19:28:05 crc kubenswrapper[5008]: I0318 19:28:05.362074 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07c07895b72cf72530c68403e19f9590e7406c5de639381d4f7eaafbad8f2d54" Mar 18 19:28:05 crc kubenswrapper[5008]: I0318 19:28:05.738623 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564362-sphbx"] Mar 18 19:28:05 crc kubenswrapper[5008]: I0318 19:28:05.743538 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564362-sphbx"] Mar 18 19:28:06 crc kubenswrapper[5008]: I0318 19:28:06.209370 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b9d9d1e-744e-4aa0-970c-c17d42363ac4" path="/var/lib/kubelet/pods/5b9d9d1e-744e-4aa0-970c-c17d42363ac4/volumes" Mar 18 19:28:12 crc kubenswrapper[5008]: I0318 19:28:12.199741 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:28:12 crc kubenswrapper[5008]: E0318 19:28:12.201091 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:28:27 crc kubenswrapper[5008]: I0318 19:28:27.198865 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:28:27 crc kubenswrapper[5008]: E0318 19:28:27.199915 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:28:38 crc kubenswrapper[5008]: I0318 19:28:38.198808 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:28:38 crc kubenswrapper[5008]: E0318 19:28:38.200608 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:28:44 crc kubenswrapper[5008]: I0318 19:28:44.735478 5008 scope.go:117] "RemoveContainer" containerID="83962fc7c9e70fcff217792fddca4f6b2eab63e53d979a327a2fbbae8c221b82" Mar 18 19:28:51 crc kubenswrapper[5008]: I0318 19:28:51.198930 5008 scope.go:117] "RemoveContainer" 
containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:28:51 crc kubenswrapper[5008]: E0318 19:28:51.199753 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:29:03 crc kubenswrapper[5008]: I0318 19:29:03.198964 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:29:03 crc kubenswrapper[5008]: E0318 19:29:03.200095 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.467596 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 19:29:11 crc kubenswrapper[5008]: E0318 19:29:11.468548 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c2a2b0-1576-4984-b1f3-97799587fccc" containerName="oc" Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.468597 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c2a2b0-1576-4984-b1f3-97799587fccc" containerName="oc" Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.468872 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c2a2b0-1576-4984-b1f3-97799587fccc" containerName="oc" Mar 18 19:29:11 crc kubenswrapper[5008]: 
I0318 19:29:11.469750 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.472623 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5gqqj" Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.481740 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.570130 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54zwj\" (UniqueName: \"kubernetes.io/projected/5fa7fa02-ca96-44a4-b42b-9509fe7d6f14-kube-api-access-54zwj\") pod \"mariadb-copy-data\" (UID: \"5fa7fa02-ca96-44a4-b42b-9509fe7d6f14\") " pod="openstack/mariadb-copy-data" Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.570374 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-62913f5c-fe93-4af7-8b39-7454ea7235b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62913f5c-fe93-4af7-8b39-7454ea7235b5\") pod \"mariadb-copy-data\" (UID: \"5fa7fa02-ca96-44a4-b42b-9509fe7d6f14\") " pod="openstack/mariadb-copy-data" Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.671978 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54zwj\" (UniqueName: \"kubernetes.io/projected/5fa7fa02-ca96-44a4-b42b-9509fe7d6f14-kube-api-access-54zwj\") pod \"mariadb-copy-data\" (UID: \"5fa7fa02-ca96-44a4-b42b-9509fe7d6f14\") " pod="openstack/mariadb-copy-data" Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.672130 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-62913f5c-fe93-4af7-8b39-7454ea7235b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62913f5c-fe93-4af7-8b39-7454ea7235b5\") pod \"mariadb-copy-data\" (UID: 
\"5fa7fa02-ca96-44a4-b42b-9509fe7d6f14\") " pod="openstack/mariadb-copy-data" Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.675941 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.675996 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-62913f5c-fe93-4af7-8b39-7454ea7235b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62913f5c-fe93-4af7-8b39-7454ea7235b5\") pod \"mariadb-copy-data\" (UID: \"5fa7fa02-ca96-44a4-b42b-9509fe7d6f14\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7683aec2ceaa0d98540479f2ea3cf9182dc5623d8f78babca9d949dd94b666aa/globalmount\"" pod="openstack/mariadb-copy-data" Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.701128 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54zwj\" (UniqueName: \"kubernetes.io/projected/5fa7fa02-ca96-44a4-b42b-9509fe7d6f14-kube-api-access-54zwj\") pod \"mariadb-copy-data\" (UID: \"5fa7fa02-ca96-44a4-b42b-9509fe7d6f14\") " pod="openstack/mariadb-copy-data" Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.709887 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-62913f5c-fe93-4af7-8b39-7454ea7235b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62913f5c-fe93-4af7-8b39-7454ea7235b5\") pod \"mariadb-copy-data\" (UID: \"5fa7fa02-ca96-44a4-b42b-9509fe7d6f14\") " pod="openstack/mariadb-copy-data" Mar 18 19:29:11 crc kubenswrapper[5008]: I0318 19:29:11.808810 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 18 19:29:12 crc kubenswrapper[5008]: I0318 19:29:12.131326 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 19:29:13 crc kubenswrapper[5008]: I0318 19:29:13.003328 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"5fa7fa02-ca96-44a4-b42b-9509fe7d6f14","Type":"ContainerStarted","Data":"34c4d7ed14e53fbd5c4caf2842ff708f07394d45fd2520ca02dd0314502ec830"} Mar 18 19:29:13 crc kubenswrapper[5008]: I0318 19:29:13.004966 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"5fa7fa02-ca96-44a4-b42b-9509fe7d6f14","Type":"ContainerStarted","Data":"5d35ee8265f5521d6b8f961630fea1326e664baa17b1d804dfe39f941ee1bce5"} Mar 18 19:29:16 crc kubenswrapper[5008]: I0318 19:29:16.016742 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=6.016712482 podStartE2EDuration="6.016712482s" podCreationTimestamp="2026-03-18 19:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:29:13.029438037 +0000 UTC m=+5209.548911116" watchObservedRunningTime="2026-03-18 19:29:16.016712482 +0000 UTC m=+5212.536185601" Mar 18 19:29:16 crc kubenswrapper[5008]: I0318 19:29:16.019595 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 18 19:29:16 crc kubenswrapper[5008]: I0318 19:29:16.021450 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:29:16 crc kubenswrapper[5008]: I0318 19:29:16.034454 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:29:16 crc kubenswrapper[5008]: I0318 19:29:16.146489 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94prq\" (UniqueName: \"kubernetes.io/projected/bb00bb10-3560-4933-a3f8-5b32bb379245-kube-api-access-94prq\") pod \"mariadb-client\" (UID: \"bb00bb10-3560-4933-a3f8-5b32bb379245\") " pod="openstack/mariadb-client" Mar 18 19:29:16 crc kubenswrapper[5008]: I0318 19:29:16.249260 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94prq\" (UniqueName: \"kubernetes.io/projected/bb00bb10-3560-4933-a3f8-5b32bb379245-kube-api-access-94prq\") pod \"mariadb-client\" (UID: \"bb00bb10-3560-4933-a3f8-5b32bb379245\") " pod="openstack/mariadb-client" Mar 18 19:29:16 crc kubenswrapper[5008]: I0318 19:29:16.270190 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94prq\" (UniqueName: \"kubernetes.io/projected/bb00bb10-3560-4933-a3f8-5b32bb379245-kube-api-access-94prq\") pod \"mariadb-client\" (UID: \"bb00bb10-3560-4933-a3f8-5b32bb379245\") " pod="openstack/mariadb-client" Mar 18 19:29:16 crc kubenswrapper[5008]: I0318 19:29:16.353447 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:29:16 crc kubenswrapper[5008]: I0318 19:29:16.817710 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:29:17 crc kubenswrapper[5008]: I0318 19:29:17.042808 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bb00bb10-3560-4933-a3f8-5b32bb379245","Type":"ContainerStarted","Data":"d24911c04aa8db58d1e518078a31609f846fe9efc7614bf06a2f424d4c5b858e"} Mar 18 19:29:17 crc kubenswrapper[5008]: I0318 19:29:17.042891 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bb00bb10-3560-4933-a3f8-5b32bb379245","Type":"ContainerStarted","Data":"0738880d42e81ef392268df444addbcaf9c4341109461749bef45ea4728d9fd1"} Mar 18 19:29:17 crc kubenswrapper[5008]: I0318 19:29:17.054991 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.054976049 podStartE2EDuration="2.054976049s" podCreationTimestamp="2026-03-18 19:29:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:29:17.053644694 +0000 UTC m=+5213.573117773" watchObservedRunningTime="2026-03-18 19:29:17.054976049 +0000 UTC m=+5213.574449128" Mar 18 19:29:17 crc kubenswrapper[5008]: I0318 19:29:17.104141 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_bb00bb10-3560-4933-a3f8-5b32bb379245/mariadb-client/0.log" Mar 18 19:29:17 crc kubenswrapper[5008]: I0318 19:29:17.199034 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:29:17 crc kubenswrapper[5008]: E0318 19:29:17.199428 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:29:18 crc kubenswrapper[5008]: I0318 19:29:18.051983 5008 generic.go:334] "Generic (PLEG): container finished" podID="bb00bb10-3560-4933-a3f8-5b32bb379245" containerID="d24911c04aa8db58d1e518078a31609f846fe9efc7614bf06a2f424d4c5b858e" exitCode=0 Mar 18 19:29:18 crc kubenswrapper[5008]: I0318 19:29:18.052077 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"bb00bb10-3560-4933-a3f8-5b32bb379245","Type":"ContainerDied","Data":"d24911c04aa8db58d1e518078a31609f846fe9efc7614bf06a2f424d4c5b858e"} Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.440617 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.485038 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.489712 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.506728 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94prq\" (UniqueName: \"kubernetes.io/projected/bb00bb10-3560-4933-a3f8-5b32bb379245-kube-api-access-94prq\") pod \"bb00bb10-3560-4933-a3f8-5b32bb379245\" (UID: \"bb00bb10-3560-4933-a3f8-5b32bb379245\") " Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.511683 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb00bb10-3560-4933-a3f8-5b32bb379245-kube-api-access-94prq" (OuterVolumeSpecName: "kube-api-access-94prq") pod "bb00bb10-3560-4933-a3f8-5b32bb379245" (UID: 
"bb00bb10-3560-4933-a3f8-5b32bb379245"). InnerVolumeSpecName "kube-api-access-94prq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.608790 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94prq\" (UniqueName: \"kubernetes.io/projected/bb00bb10-3560-4933-a3f8-5b32bb379245-kube-api-access-94prq\") on node \"crc\" DevicePath \"\"" Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.678001 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 18 19:29:19 crc kubenswrapper[5008]: E0318 19:29:19.678976 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb00bb10-3560-4933-a3f8-5b32bb379245" containerName="mariadb-client" Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.678999 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb00bb10-3560-4933-a3f8-5b32bb379245" containerName="mariadb-client" Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.679221 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb00bb10-3560-4933-a3f8-5b32bb379245" containerName="mariadb-client" Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.681159 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.703402 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.710240 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmvr2\" (UniqueName: \"kubernetes.io/projected/f24eda31-4b19-4f84-85e1-830fb18e8c08-kube-api-access-fmvr2\") pod \"mariadb-client\" (UID: \"f24eda31-4b19-4f84-85e1-830fb18e8c08\") " pod="openstack/mariadb-client" Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.812582 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmvr2\" (UniqueName: \"kubernetes.io/projected/f24eda31-4b19-4f84-85e1-830fb18e8c08-kube-api-access-fmvr2\") pod \"mariadb-client\" (UID: \"f24eda31-4b19-4f84-85e1-830fb18e8c08\") " pod="openstack/mariadb-client" Mar 18 19:29:19 crc kubenswrapper[5008]: I0318 19:29:19.832511 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmvr2\" (UniqueName: \"kubernetes.io/projected/f24eda31-4b19-4f84-85e1-830fb18e8c08-kube-api-access-fmvr2\") pod \"mariadb-client\" (UID: \"f24eda31-4b19-4f84-85e1-830fb18e8c08\") " pod="openstack/mariadb-client" Mar 18 19:29:20 crc kubenswrapper[5008]: I0318 19:29:20.004254 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:29:20 crc kubenswrapper[5008]: I0318 19:29:20.071512 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0738880d42e81ef392268df444addbcaf9c4341109461749bef45ea4728d9fd1" Mar 18 19:29:20 crc kubenswrapper[5008]: I0318 19:29:20.071649 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:29:20 crc kubenswrapper[5008]: I0318 19:29:20.106248 5008 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="bb00bb10-3560-4933-a3f8-5b32bb379245" podUID="f24eda31-4b19-4f84-85e1-830fb18e8c08" Mar 18 19:29:20 crc kubenswrapper[5008]: I0318 19:29:20.210633 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb00bb10-3560-4933-a3f8-5b32bb379245" path="/var/lib/kubelet/pods/bb00bb10-3560-4933-a3f8-5b32bb379245/volumes" Mar 18 19:29:20 crc kubenswrapper[5008]: I0318 19:29:20.480142 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:29:21 crc kubenswrapper[5008]: I0318 19:29:21.084118 5008 generic.go:334] "Generic (PLEG): container finished" podID="f24eda31-4b19-4f84-85e1-830fb18e8c08" containerID="7580b00b3d1992c65ce9d70fcf8ee5f49fc6d055bf398aa4aeb6d84c48e8ffce" exitCode=0 Mar 18 19:29:21 crc kubenswrapper[5008]: I0318 19:29:21.084193 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f24eda31-4b19-4f84-85e1-830fb18e8c08","Type":"ContainerDied","Data":"7580b00b3d1992c65ce9d70fcf8ee5f49fc6d055bf398aa4aeb6d84c48e8ffce"} Mar 18 19:29:21 crc kubenswrapper[5008]: I0318 19:29:21.084646 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f24eda31-4b19-4f84-85e1-830fb18e8c08","Type":"ContainerStarted","Data":"346a3ed5bdc3c13b8d7cf7a876deeb890f6b3b52263b6ca356ac3e75293c0692"} Mar 18 19:29:22 crc kubenswrapper[5008]: I0318 19:29:22.464282 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:29:22 crc kubenswrapper[5008]: I0318 19:29:22.531538 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_f24eda31-4b19-4f84-85e1-830fb18e8c08/mariadb-client/0.log" Mar 18 19:29:22 crc kubenswrapper[5008]: I0318 19:29:22.555361 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmvr2\" (UniqueName: \"kubernetes.io/projected/f24eda31-4b19-4f84-85e1-830fb18e8c08-kube-api-access-fmvr2\") pod \"f24eda31-4b19-4f84-85e1-830fb18e8c08\" (UID: \"f24eda31-4b19-4f84-85e1-830fb18e8c08\") " Mar 18 19:29:22 crc kubenswrapper[5008]: I0318 19:29:22.559923 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:29:22 crc kubenswrapper[5008]: I0318 19:29:22.562952 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24eda31-4b19-4f84-85e1-830fb18e8c08-kube-api-access-fmvr2" (OuterVolumeSpecName: "kube-api-access-fmvr2") pod "f24eda31-4b19-4f84-85e1-830fb18e8c08" (UID: "f24eda31-4b19-4f84-85e1-830fb18e8c08"). InnerVolumeSpecName "kube-api-access-fmvr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:29:22 crc kubenswrapper[5008]: I0318 19:29:22.566818 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 18 19:29:22 crc kubenswrapper[5008]: I0318 19:29:22.656730 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmvr2\" (UniqueName: \"kubernetes.io/projected/f24eda31-4b19-4f84-85e1-830fb18e8c08-kube-api-access-fmvr2\") on node \"crc\" DevicePath \"\"" Mar 18 19:29:23 crc kubenswrapper[5008]: I0318 19:29:23.105642 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="346a3ed5bdc3c13b8d7cf7a876deeb890f6b3b52263b6ca356ac3e75293c0692" Mar 18 19:29:23 crc kubenswrapper[5008]: I0318 19:29:23.105726 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 19:29:24 crc kubenswrapper[5008]: I0318 19:29:24.215220 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f24eda31-4b19-4f84-85e1-830fb18e8c08" path="/var/lib/kubelet/pods/f24eda31-4b19-4f84-85e1-830fb18e8c08/volumes" Mar 18 19:29:30 crc kubenswrapper[5008]: I0318 19:29:30.198967 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:29:30 crc kubenswrapper[5008]: E0318 19:29:30.200316 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:29:42 crc kubenswrapper[5008]: I0318 19:29:42.198464 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:29:42 
crc kubenswrapper[5008]: E0318 19:29:42.199182 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:29:44 crc kubenswrapper[5008]: I0318 19:29:44.840949 5008 scope.go:117] "RemoveContainer" containerID="2a2ddbff83bf852ae8f892d67c0a07b59310a63c97ac11a8397fab68c1ad68e7" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.784191 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 19:29:56 crc kubenswrapper[5008]: E0318 19:29:56.785144 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24eda31-4b19-4f84-85e1-830fb18e8c08" containerName="mariadb-client" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.785162 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24eda31-4b19-4f84-85e1-830fb18e8c08" containerName="mariadb-client" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.785391 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f24eda31-4b19-4f84-85e1-830fb18e8c08" containerName="mariadb-client" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.786677 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.789003 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.789344 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.789506 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-6cxzt" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.814202 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.815515 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.823424 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.824663 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.838034 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.858358 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.866236 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.885282 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1fa7681-037d-42ad-bded-ab849fd5541b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.885326 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1fa7681-037d-42ad-bded-ab849fd5541b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.885379 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rldkt\" (UniqueName: \"kubernetes.io/projected/a1fa7681-037d-42ad-bded-ab849fd5541b-kube-api-access-rldkt\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.885531 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a176d348-b48c-42d7-8ece-14a94017793f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a176d348-b48c-42d7-8ece-14a94017793f\") pod 
\"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.885676 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1fa7681-037d-42ad-bded-ab849fd5541b-config\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.885790 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1fa7681-037d-42ad-bded-ab849fd5541b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.977226 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.978691 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.980223 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.980368 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.980472 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-tggbj" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987575 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3796e669-b541-4e2c-b876-006a019c5d9a-config\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987621 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs888\" (UniqueName: \"kubernetes.io/projected/3e8645c7-3653-4517-b1c9-2be63877356f-kube-api-access-cs888\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987640 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e8645c7-3653-4517-b1c9-2be63877356f-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987660 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-347ced09-ffff-4a31-989a-58a0b9879da3\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-347ced09-ffff-4a31-989a-58a0b9879da3\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987685 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a176d348-b48c-42d7-8ece-14a94017793f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a176d348-b48c-42d7-8ece-14a94017793f\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987714 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1fa7681-037d-42ad-bded-ab849fd5541b-config\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987740 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e8645c7-3653-4517-b1c9-2be63877356f-config\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987765 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3796e669-b541-4e2c-b876-006a019c5d9a-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987781 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1fa7681-037d-42ad-bded-ab849fd5541b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987801 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3796e669-b541-4e2c-b876-006a019c5d9a-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987820 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1fa7681-037d-42ad-bded-ab849fd5541b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987833 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1fa7681-037d-42ad-bded-ab849fd5541b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987854 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8645c7-3653-4517-b1c9-2be63877356f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2" Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987870 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h58mt\" (UniqueName: \"kubernetes.io/projected/3796e669-b541-4e2c-b876-006a019c5d9a-kube-api-access-h58mt\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1" Mar 18 19:29:56 crc kubenswrapper[5008]: 
I0318 19:29:56.987894 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e8645c7-3653-4517-b1c9-2be63877356f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987913 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3796e669-b541-4e2c-b876-006a019c5d9a-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.987937 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rldkt\" (UniqueName: \"kubernetes.io/projected/a1fa7681-037d-42ad-bded-ab849fd5541b-kube-api-access-rldkt\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.988745 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d4b62cd3-cf94-4e4e-9e3c-f48f4434cd81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d4b62cd3-cf94-4e4e-9e3c-f48f4434cd81\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.989096 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1fa7681-037d-42ad-bded-ab849fd5541b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.989248 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1fa7681-037d-42ad-bded-ab849fd5541b-config\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.989473 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1fa7681-037d-42ad-bded-ab849fd5541b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.989507 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 19:29:56 crc kubenswrapper[5008]: I0318 19:29:56.997165 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1fa7681-037d-42ad-bded-ab849fd5541b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.008245 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"]
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.009315 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.017365 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.017396 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a176d348-b48c-42d7-8ece-14a94017793f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a176d348-b48c-42d7-8ece-14a94017793f\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/86b0ebd7e9903681024da10f64deefbf80b6f3f7252c2df6bcee4dd18c952f2d/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.017627 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rldkt\" (UniqueName: \"kubernetes.io/projected/a1fa7681-037d-42ad-bded-ab849fd5541b-kube-api-access-rldkt\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.024033 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"]
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.025290 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.034400 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.040605 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.049868 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a176d348-b48c-42d7-8ece-14a94017793f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a176d348-b48c-42d7-8ece-14a94017793f\") pod \"ovsdbserver-nb-0\" (UID: \"a1fa7681-037d-42ad-bded-ab849fd5541b\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.089950 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cdffdc29-be26-4cfb-9b4a-953f20ba3725\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cdffdc29-be26-4cfb-9b4a-953f20ba3725\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090015 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e8645c7-3653-4517-b1c9-2be63877356f-config\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090038 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090081 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3796e669-b541-4e2c-b876-006a019c5d9a-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090106 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090136 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-93d6c9ef-e05b-4012-813f-1c495e57955e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93d6c9ef-e05b-4012-813f-1c495e57955e\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090161 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa59857-10dc-408d-9f91-3ed06b022f0c-config\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090190 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3796e669-b541-4e2c-b876-006a019c5d9a-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090225 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8645c7-3653-4517-b1c9-2be63877356f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090249 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h58mt\" (UniqueName: \"kubernetes.io/projected/3796e669-b541-4e2c-b876-006a019c5d9a-kube-api-access-h58mt\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090272 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4mph\" (UniqueName: \"kubernetes.io/projected/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-kube-api-access-k4mph\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090293 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0aa59857-10dc-408d-9f91-3ed06b022f0c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090315 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1028aa1c-2139-49f0-8ed6-4187186bc1c9-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090340 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1028aa1c-2139-49f0-8ed6-4187186bc1c9-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090373 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e8645c7-3653-4517-b1c9-2be63877356f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090400 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3796e669-b541-4e2c-b876-006a019c5d9a-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090435 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-config\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090465 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d4b62cd3-cf94-4e4e-9e3c-f48f4434cd81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d4b62cd3-cf94-4e4e-9e3c-f48f4434cd81\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090491 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa59857-10dc-408d-9f91-3ed06b022f0c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090514 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090539 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3796e669-b541-4e2c-b876-006a019c5d9a-config\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090580 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aa59857-10dc-408d-9f91-3ed06b022f0c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090605 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs888\" (UniqueName: \"kubernetes.io/projected/3e8645c7-3653-4517-b1c9-2be63877356f-kube-api-access-cs888\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090626 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-242r4\" (UniqueName: \"kubernetes.io/projected/1028aa1c-2139-49f0-8ed6-4187186bc1c9-kube-api-access-242r4\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090647 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e8645c7-3653-4517-b1c9-2be63877356f-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090671 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0a66e76f-b5c7-4b54-a394-40721bfea858\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a66e76f-b5c7-4b54-a394-40721bfea858\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090691 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1028aa1c-2139-49f0-8ed6-4187186bc1c9-config\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090719 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-347ced09-ffff-4a31-989a-58a0b9879da3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-347ced09-ffff-4a31-989a-58a0b9879da3\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090745 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1028aa1c-2139-49f0-8ed6-4187186bc1c9-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.090777 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sbjq\" (UniqueName: \"kubernetes.io/projected/0aa59857-10dc-408d-9f91-3ed06b022f0c-kube-api-access-6sbjq\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.091441 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e8645c7-3653-4517-b1c9-2be63877356f-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.091595 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3796e669-b541-4e2c-b876-006a019c5d9a-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.092104 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3796e669-b541-4e2c-b876-006a019c5d9a-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.092164 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3796e669-b541-4e2c-b876-006a019c5d9a-config\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.092169 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e8645c7-3653-4517-b1c9-2be63877356f-config\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.093704 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.093730 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-347ced09-ffff-4a31-989a-58a0b9879da3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-347ced09-ffff-4a31-989a-58a0b9879da3\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4689780408b88e6e6c2ec6972b9aa4ba6e6035aa1231d65e4662716125623b3f/globalmount\"" pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.094308 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.094354 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d4b62cd3-cf94-4e4e-9e3c-f48f4434cd81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d4b62cd3-cf94-4e4e-9e3c-f48f4434cd81\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ff261be1ce88970d23841cb29204fc8ccae32cfa883d7a3eef0fced7bb949564/globalmount\"" pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.095096 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8645c7-3653-4517-b1c9-2be63877356f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.096067 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3796e669-b541-4e2c-b876-006a019c5d9a-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.096672 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e8645c7-3653-4517-b1c9-2be63877356f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.105478 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.109354 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h58mt\" (UniqueName: \"kubernetes.io/projected/3796e669-b541-4e2c-b876-006a019c5d9a-kube-api-access-h58mt\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.114409 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs888\" (UniqueName: \"kubernetes.io/projected/3e8645c7-3653-4517-b1c9-2be63877356f-kube-api-access-cs888\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.115575 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-347ced09-ffff-4a31-989a-58a0b9879da3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-347ced09-ffff-4a31-989a-58a0b9879da3\") pod \"ovsdbserver-nb-1\" (UID: \"3796e669-b541-4e2c-b876-006a019c5d9a\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.136822 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d4b62cd3-cf94-4e4e-9e3c-f48f4434cd81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d4b62cd3-cf94-4e4e-9e3c-f48f4434cd81\") pod \"ovsdbserver-nb-2\" (UID: \"3e8645c7-3653-4517-b1c9-2be63877356f\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.140362 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.151732 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.191834 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1028aa1c-2139-49f0-8ed6-4187186bc1c9-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.191886 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sbjq\" (UniqueName: \"kubernetes.io/projected/0aa59857-10dc-408d-9f91-3ed06b022f0c-kube-api-access-6sbjq\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.191925 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cdffdc29-be26-4cfb-9b4a-953f20ba3725\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cdffdc29-be26-4cfb-9b4a-953f20ba3725\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.191945 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.191971 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.191989 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-93d6c9ef-e05b-4012-813f-1c495e57955e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93d6c9ef-e05b-4012-813f-1c495e57955e\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.192008 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa59857-10dc-408d-9f91-3ed06b022f0c-config\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.192043 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4mph\" (UniqueName: \"kubernetes.io/projected/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-kube-api-access-k4mph\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.192058 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0aa59857-10dc-408d-9f91-3ed06b022f0c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.192082 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1028aa1c-2139-49f0-8ed6-4187186bc1c9-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.192103 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1028aa1c-2139-49f0-8ed6-4187186bc1c9-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.192141 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-config\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.192161 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa59857-10dc-408d-9f91-3ed06b022f0c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.192178 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.192203 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aa59857-10dc-408d-9f91-3ed06b022f0c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.192221 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-242r4\" (UniqueName: \"kubernetes.io/projected/1028aa1c-2139-49f0-8ed6-4187186bc1c9-kube-api-access-242r4\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.192237 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0a66e76f-b5c7-4b54-a394-40721bfea858\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a66e76f-b5c7-4b54-a394-40721bfea858\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.192253 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1028aa1c-2139-49f0-8ed6-4187186bc1c9-config\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.192326 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1028aa1c-2139-49f0-8ed6-4187186bc1c9-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.193185 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1028aa1c-2139-49f0-8ed6-4187186bc1c9-config\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.193605 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.194068 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0aa59857-10dc-408d-9f91-3ed06b022f0c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.194587 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0aa59857-10dc-408d-9f91-3ed06b022f0c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.196046 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-config\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.217464 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa59857-10dc-408d-9f91-3ed06b022f0c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.217577 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa59857-10dc-408d-9f91-3ed06b022f0c-config\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.219453 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.219502 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0a66e76f-b5c7-4b54-a394-40721bfea858\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a66e76f-b5c7-4b54-a394-40721bfea858\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b56de5fb437e66073d7dc2f1197f684e8f1974c08ab67b4edda1eae90ffef77a/globalmount\"" pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.223434 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1028aa1c-2139-49f0-8ed6-4187186bc1c9-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.223439 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1028aa1c-2139-49f0-8ed6-4187186bc1c9-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.223664 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.223705 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cdffdc29-be26-4cfb-9b4a-953f20ba3725\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cdffdc29-be26-4cfb-9b4a-953f20ba3725\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/070db5796673b3a2cfdcc15f94450dcdf699a9e8265bffa184a1a96601b4a8e4/globalmount\"" pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.224464 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.224993 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.225027 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-93d6c9ef-e05b-4012-813f-1c495e57955e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93d6c9ef-e05b-4012-813f-1c495e57955e\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/17472a1a7ae62fa0e26f5d65d1afe204403eb52563c4c2d45b35260fe38ec3c2/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.229975 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7"
Mar 18 19:29:57 crc kubenswrapper[5008]: E0318 19:29:57.237266 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.253670 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.258460 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sbjq\" (UniqueName: \"kubernetes.io/projected/0aa59857-10dc-408d-9f91-3ed06b022f0c-kube-api-access-6sbjq\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 19:29:57
crc kubenswrapper[5008]: I0318 19:29:57.259448 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-242r4\" (UniqueName: \"kubernetes.io/projected/1028aa1c-2139-49f0-8ed6-4187186bc1c9-kube-api-access-242r4\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.267621 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4mph\" (UniqueName: \"kubernetes.io/projected/28bb0d68-09a7-4c21-a62f-b7f0687e22c4-kube-api-access-k4mph\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.280388 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-93d6c9ef-e05b-4012-813f-1c495e57955e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93d6c9ef-e05b-4012-813f-1c495e57955e\") pod \"ovsdbserver-sb-0\" (UID: \"0aa59857-10dc-408d-9f91-3ed06b022f0c\") " pod="openstack/ovsdbserver-sb-0" Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.282447 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cdffdc29-be26-4cfb-9b4a-953f20ba3725\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cdffdc29-be26-4cfb-9b4a-953f20ba3725\") pod \"ovsdbserver-sb-1\" (UID: \"28bb0d68-09a7-4c21-a62f-b7f0687e22c4\") " pod="openstack/ovsdbserver-sb-1" Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.302306 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0a66e76f-b5c7-4b54-a394-40721bfea858\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a66e76f-b5c7-4b54-a394-40721bfea858\") pod \"ovsdbserver-sb-2\" (UID: \"1028aa1c-2139-49f0-8ed6-4187186bc1c9\") " pod="openstack/ovsdbserver-sb-2" Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.370656 5008 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.384141 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.390880 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.611627 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 19:29:57 crc kubenswrapper[5008]: I0318 19:29:57.727512 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 18 19:29:57 crc kubenswrapper[5008]: W0318 19:29:57.728750 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e8645c7_3653_4517_b1c9_2be63877356f.slice/crio-867f576c1f627edea51780aad6aacecc5b7fd75abc2efab4ced9ae896b04ea2a WatchSource:0}: Error finding container 867f576c1f627edea51780aad6aacecc5b7fd75abc2efab4ced9ae896b04ea2a: Status 404 returned error can't find the container with id 867f576c1f627edea51780aad6aacecc5b7fd75abc2efab4ced9ae896b04ea2a Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.046355 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.442904 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a1fa7681-037d-42ad-bded-ab849fd5541b","Type":"ContainerStarted","Data":"7f63e3ff0afc0002e4de5d3d4330a32900b08a878c3504d439af52c0520ac136"} Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.443152 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"a1fa7681-037d-42ad-bded-ab849fd5541b","Type":"ContainerStarted","Data":"758ba84ec48e5b2e2e0bdf0da6409456611739e768ed1589986bf28df21b1e8f"} Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.443163 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a1fa7681-037d-42ad-bded-ab849fd5541b","Type":"ContainerStarted","Data":"77a8b249a9583ad958882fb90b2de2c60c69d00cca3d98e1e985094d8c5d9563"} Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.444519 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3e8645c7-3653-4517-b1c9-2be63877356f","Type":"ContainerStarted","Data":"1359721691d404df1f35ef052aebc8c88d2c8d028b72dd458b6438f95751c12c"} Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.444542 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3e8645c7-3653-4517-b1c9-2be63877356f","Type":"ContainerStarted","Data":"7b18858595fdcf99609a7de55b4f87112c9838b57fe872275ff6c2f1498e34f1"} Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.444551 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3e8645c7-3653-4517-b1c9-2be63877356f","Type":"ContainerStarted","Data":"867f576c1f627edea51780aad6aacecc5b7fd75abc2efab4ced9ae896b04ea2a"} Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.446367 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"1028aa1c-2139-49f0-8ed6-4187186bc1c9","Type":"ContainerStarted","Data":"f37213ee08cc91871d6444ef992a8855dae559231dbbb66e6a7838b8582c5988"} Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.446392 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"1028aa1c-2139-49f0-8ed6-4187186bc1c9","Type":"ContainerStarted","Data":"fe3e14e691fd900d0fa745d19462fd72999a246a5a50a4c0742aaf7e79349cea"} Mar 18 19:29:58 crc 
kubenswrapper[5008]: I0318 19:29:58.446401 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"1028aa1c-2139-49f0-8ed6-4187186bc1c9","Type":"ContainerStarted","Data":"41e13901ac99126f79dd447c632443f15795e2aea43a612a9a6ea581b8e2a71c"} Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.464246 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.4642277 podStartE2EDuration="3.4642277s" podCreationTimestamp="2026-03-18 19:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:29:58.458344516 +0000 UTC m=+5254.977817595" watchObservedRunningTime="2026-03-18 19:29:58.4642277 +0000 UTC m=+5254.983700779" Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.477220 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.4771998 podStartE2EDuration="3.4771998s" podCreationTimestamp="2026-03-18 19:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:29:58.474711735 +0000 UTC m=+5254.994184814" watchObservedRunningTime="2026-03-18 19:29:58.4771998 +0000 UTC m=+5254.996672879" Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.493843 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.493826596 podStartE2EDuration="3.493826596s" podCreationTimestamp="2026-03-18 19:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:29:58.49132311 +0000 UTC m=+5255.010796189" watchObservedRunningTime="2026-03-18 19:29:58.493826596 +0000 UTC m=+5255.013299665" Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 
19:29:58.530267 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 19:29:58 crc kubenswrapper[5008]: W0318 19:29:58.534465 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aa59857_10dc_408d_9f91_3ed06b022f0c.slice/crio-ebfbc7c7b2df89fa6cbf35dc420c7227e5a19f59e874e114be4ad734565f1f20 WatchSource:0}: Error finding container ebfbc7c7b2df89fa6cbf35dc420c7227e5a19f59e874e114be4ad734565f1f20: Status 404 returned error can't find the container with id ebfbc7c7b2df89fa6cbf35dc420c7227e5a19f59e874e114be4ad734565f1f20 Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.636260 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 18 19:29:58 crc kubenswrapper[5008]: W0318 19:29:58.637415 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3796e669_b541_4e2c_b876_006a019c5d9a.slice/crio-6bd8d833f81509a0ea2801a7b5fbc3242383982da72300711d454893f0f27c6f WatchSource:0}: Error finding container 6bd8d833f81509a0ea2801a7b5fbc3242383982da72300711d454893f0f27c6f: Status 404 returned error can't find the container with id 6bd8d833f81509a0ea2801a7b5fbc3242383982da72300711d454893f0f27c6f Mar 18 19:29:58 crc kubenswrapper[5008]: W0318 19:29:58.935106 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28bb0d68_09a7_4c21_a62f_b7f0687e22c4.slice/crio-97df8261597a80af532555ad1353f9f12d00b6d297acd4249bb23347b0acf346 WatchSource:0}: Error finding container 97df8261597a80af532555ad1353f9f12d00b6d297acd4249bb23347b0acf346: Status 404 returned error can't find the container with id 97df8261597a80af532555ad1353f9f12d00b6d297acd4249bb23347b0acf346 Mar 18 19:29:58 crc kubenswrapper[5008]: I0318 19:29:58.935703 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-1"] Mar 18 19:29:59 crc kubenswrapper[5008]: I0318 19:29:59.457351 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0aa59857-10dc-408d-9f91-3ed06b022f0c","Type":"ContainerStarted","Data":"97c36944eacf8f17900c09ac49dd287dbf53aaaba53260a795de5bacb7a9e71a"} Mar 18 19:29:59 crc kubenswrapper[5008]: I0318 19:29:59.457391 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0aa59857-10dc-408d-9f91-3ed06b022f0c","Type":"ContainerStarted","Data":"592fe65899edcf25e08eb749a5bb12acca40d3eac6d1b3b8c9bc97f6e9332a65"} Mar 18 19:29:59 crc kubenswrapper[5008]: I0318 19:29:59.457400 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0aa59857-10dc-408d-9f91-3ed06b022f0c","Type":"ContainerStarted","Data":"ebfbc7c7b2df89fa6cbf35dc420c7227e5a19f59e874e114be4ad734565f1f20"} Mar 18 19:29:59 crc kubenswrapper[5008]: I0318 19:29:59.461339 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"3796e669-b541-4e2c-b876-006a019c5d9a","Type":"ContainerStarted","Data":"a66d2cf83186f7fdc3d5b16868c5cb59d55b75b75a851bb633a5203bd658e8ab"} Mar 18 19:29:59 crc kubenswrapper[5008]: I0318 19:29:59.461424 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"3796e669-b541-4e2c-b876-006a019c5d9a","Type":"ContainerStarted","Data":"d5f7405ea601979439bb565928c1bd015f38d46a839460935cb0871c11c337ad"} Mar 18 19:29:59 crc kubenswrapper[5008]: I0318 19:29:59.461451 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"3796e669-b541-4e2c-b876-006a019c5d9a","Type":"ContainerStarted","Data":"6bd8d833f81509a0ea2801a7b5fbc3242383982da72300711d454893f0f27c6f"} Mar 18 19:29:59 crc kubenswrapper[5008]: I0318 19:29:59.464218 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-1" event={"ID":"28bb0d68-09a7-4c21-a62f-b7f0687e22c4","Type":"ContainerStarted","Data":"11abddc5fc6734f00372dc3384315a8fdc9bddccd7cdcf191989efe927e1070b"} Mar 18 19:29:59 crc kubenswrapper[5008]: I0318 19:29:59.464293 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"28bb0d68-09a7-4c21-a62f-b7f0687e22c4","Type":"ContainerStarted","Data":"7961a4dd82fd531dd5dc2d3d43cf318771473a32008e5c360795d92edb9aad9d"} Mar 18 19:29:59 crc kubenswrapper[5008]: I0318 19:29:59.464322 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"28bb0d68-09a7-4c21-a62f-b7f0687e22c4","Type":"ContainerStarted","Data":"97df8261597a80af532555ad1353f9f12d00b6d297acd4249bb23347b0acf346"} Mar 18 19:29:59 crc kubenswrapper[5008]: I0318 19:29:59.479166 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.479149994 podStartE2EDuration="4.479149994s" podCreationTimestamp="2026-03-18 19:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:29:59.472637083 +0000 UTC m=+5255.992110152" watchObservedRunningTime="2026-03-18 19:29:59.479149994 +0000 UTC m=+5255.998623073" Mar 18 19:29:59 crc kubenswrapper[5008]: I0318 19:29:59.497337 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.49731644 podStartE2EDuration="4.49731644s" podCreationTimestamp="2026-03-18 19:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:29:59.493234223 +0000 UTC m=+5256.012707302" watchObservedRunningTime="2026-03-18 19:29:59.49731644 +0000 UTC m=+5256.016789519" Mar 18 19:29:59 crc kubenswrapper[5008]: I0318 19:29:59.518066 5008 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.518047663 podStartE2EDuration="4.518047663s" podCreationTimestamp="2026-03-18 19:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:29:59.51029946 +0000 UTC m=+5256.029772539" watchObservedRunningTime="2026-03-18 19:29:59.518047663 +0000 UTC m=+5256.037520742" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.106262 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.141011 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.141371 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564370-d58j9"] Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.143630 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564370-d58j9" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.148287 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.148299 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.148460 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.150789 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr"] Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.151764 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.152305 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.153180 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.153417 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.161147 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564370-d58j9"] Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.193419 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr"] Mar 18 19:30:00 crc kubenswrapper[5008]: 
I0318 19:30:00.195183 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.214354 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.254615 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc5745ac-0808-4b1c-a447-9597236492b0-secret-volume\") pod \"collect-profiles-29564370-zpbkr\" (UID: \"cc5745ac-0808-4b1c-a447-9597236492b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.254696 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqkrc\" (UniqueName: \"kubernetes.io/projected/1be5fd66-baab-4624-a51e-17d49300c05a-kube-api-access-nqkrc\") pod \"auto-csr-approver-29564370-d58j9\" (UID: \"1be5fd66-baab-4624-a51e-17d49300c05a\") " pod="openshift-infra/auto-csr-approver-29564370-d58j9" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.254959 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhnfc\" (UniqueName: \"kubernetes.io/projected/cc5745ac-0808-4b1c-a447-9597236492b0-kube-api-access-hhnfc\") pod \"collect-profiles-29564370-zpbkr\" (UID: \"cc5745ac-0808-4b1c-a447-9597236492b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.255101 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc5745ac-0808-4b1c-a447-9597236492b0-config-volume\") pod \"collect-profiles-29564370-zpbkr\" (UID: \"cc5745ac-0808-4b1c-a447-9597236492b0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.355996 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc5745ac-0808-4b1c-a447-9597236492b0-secret-volume\") pod \"collect-profiles-29564370-zpbkr\" (UID: \"cc5745ac-0808-4b1c-a447-9597236492b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.356115 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqkrc\" (UniqueName: \"kubernetes.io/projected/1be5fd66-baab-4624-a51e-17d49300c05a-kube-api-access-nqkrc\") pod \"auto-csr-approver-29564370-d58j9\" (UID: \"1be5fd66-baab-4624-a51e-17d49300c05a\") " pod="openshift-infra/auto-csr-approver-29564370-d58j9" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.356282 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhnfc\" (UniqueName: \"kubernetes.io/projected/cc5745ac-0808-4b1c-a447-9597236492b0-kube-api-access-hhnfc\") pod \"collect-profiles-29564370-zpbkr\" (UID: \"cc5745ac-0808-4b1c-a447-9597236492b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.356363 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc5745ac-0808-4b1c-a447-9597236492b0-config-volume\") pod \"collect-profiles-29564370-zpbkr\" (UID: \"cc5745ac-0808-4b1c-a447-9597236492b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.357240 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/cc5745ac-0808-4b1c-a447-9597236492b0-config-volume\") pod \"collect-profiles-29564370-zpbkr\" (UID: \"cc5745ac-0808-4b1c-a447-9597236492b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.362945 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc5745ac-0808-4b1c-a447-9597236492b0-secret-volume\") pod \"collect-profiles-29564370-zpbkr\" (UID: \"cc5745ac-0808-4b1c-a447-9597236492b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.371307 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.377320 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhnfc\" (UniqueName: \"kubernetes.io/projected/cc5745ac-0808-4b1c-a447-9597236492b0-kube-api-access-hhnfc\") pod \"collect-profiles-29564370-zpbkr\" (UID: \"cc5745ac-0808-4b1c-a447-9597236492b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.383381 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqkrc\" (UniqueName: \"kubernetes.io/projected/1be5fd66-baab-4624-a51e-17d49300c05a-kube-api-access-nqkrc\") pod \"auto-csr-approver-29564370-d58j9\" (UID: \"1be5fd66-baab-4624-a51e-17d49300c05a\") " pod="openshift-infra/auto-csr-approver-29564370-d58j9" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.385216 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.391489 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/ovsdbserver-sb-1" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.471032 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.471073 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.487966 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564370-d58j9" Mar 18 19:30:00 crc kubenswrapper[5008]: I0318 19:30:00.507965 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" Mar 18 19:30:01 crc kubenswrapper[5008]: I0318 19:30:01.062052 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr"] Mar 18 19:30:01 crc kubenswrapper[5008]: I0318 19:30:01.070362 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 19:30:01 crc kubenswrapper[5008]: I0318 19:30:01.071778 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564370-d58j9"] Mar 18 19:30:01 crc kubenswrapper[5008]: I0318 19:30:01.478237 5008 generic.go:334] "Generic (PLEG): container finished" podID="cc5745ac-0808-4b1c-a447-9597236492b0" containerID="3340ad1791d680fdca6c1d3130e36022857a86f18377b3e50769f39f4ab5cff6" exitCode=0 Mar 18 19:30:01 crc kubenswrapper[5008]: I0318 19:30:01.478286 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" event={"ID":"cc5745ac-0808-4b1c-a447-9597236492b0","Type":"ContainerDied","Data":"3340ad1791d680fdca6c1d3130e36022857a86f18377b3e50769f39f4ab5cff6"} Mar 18 19:30:01 crc kubenswrapper[5008]: I0318 19:30:01.478577 5008 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" event={"ID":"cc5745ac-0808-4b1c-a447-9597236492b0","Type":"ContainerStarted","Data":"c8180e67c479e5049bc2eec507e3447121483cede84cf7d8e9a1d98ea91ba4be"} Mar 18 19:30:01 crc kubenswrapper[5008]: I0318 19:30:01.480183 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564370-d58j9" event={"ID":"1be5fd66-baab-4624-a51e-17d49300c05a","Type":"ContainerStarted","Data":"ae445ba7fa070e20cfc29005df1ebebbe7c177f65eb01ad474cdcaad07aa03a9"} Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.141758 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.173282 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.222955 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.371731 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.387679 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.391695 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.406978 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-649dc66575-d5m6l"] Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.408965 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.413447 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-649dc66575-d5m6l"] Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.414422 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.490165 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-dns-svc\") pod \"dnsmasq-dns-649dc66575-d5m6l\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.490196 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-ovsdbserver-nb\") pod \"dnsmasq-dns-649dc66575-d5m6l\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.490220 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8wpq\" (UniqueName: \"kubernetes.io/projected/32beb433-6bea-4bf9-b667-9cc7dd6c4444-kube-api-access-b8wpq\") pod \"dnsmasq-dns-649dc66575-d5m6l\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.490294 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-config\") pod \"dnsmasq-dns-649dc66575-d5m6l\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " 
pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.591759 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-config\") pod \"dnsmasq-dns-649dc66575-d5m6l\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.591937 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-dns-svc\") pod \"dnsmasq-dns-649dc66575-d5m6l\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.591972 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-ovsdbserver-nb\") pod \"dnsmasq-dns-649dc66575-d5m6l\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.592000 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8wpq\" (UniqueName: \"kubernetes.io/projected/32beb433-6bea-4bf9-b667-9cc7dd6c4444-kube-api-access-b8wpq\") pod \"dnsmasq-dns-649dc66575-d5m6l\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.593762 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-config\") pod \"dnsmasq-dns-649dc66575-d5m6l\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 
19:30:02.596346 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-dns-svc\") pod \"dnsmasq-dns-649dc66575-d5m6l\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.597336 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-ovsdbserver-nb\") pod \"dnsmasq-dns-649dc66575-d5m6l\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.638827 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8wpq\" (UniqueName: \"kubernetes.io/projected/32beb433-6bea-4bf9-b667-9cc7dd6c4444-kube-api-access-b8wpq\") pod \"dnsmasq-dns-649dc66575-d5m6l\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.725121 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.826696 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.897052 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc5745ac-0808-4b1c-a447-9597236492b0-config-volume\") pod \"cc5745ac-0808-4b1c-a447-9597236492b0\" (UID: \"cc5745ac-0808-4b1c-a447-9597236492b0\") " Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.897162 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc5745ac-0808-4b1c-a447-9597236492b0-secret-volume\") pod \"cc5745ac-0808-4b1c-a447-9597236492b0\" (UID: \"cc5745ac-0808-4b1c-a447-9597236492b0\") " Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.897362 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhnfc\" (UniqueName: \"kubernetes.io/projected/cc5745ac-0808-4b1c-a447-9597236492b0-kube-api-access-hhnfc\") pod \"cc5745ac-0808-4b1c-a447-9597236492b0\" (UID: \"cc5745ac-0808-4b1c-a447-9597236492b0\") " Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.898071 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc5745ac-0808-4b1c-a447-9597236492b0-config-volume" (OuterVolumeSpecName: "config-volume") pod "cc5745ac-0808-4b1c-a447-9597236492b0" (UID: "cc5745ac-0808-4b1c-a447-9597236492b0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.902655 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc5745ac-0808-4b1c-a447-9597236492b0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cc5745ac-0808-4b1c-a447-9597236492b0" (UID: "cc5745ac-0808-4b1c-a447-9597236492b0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.913654 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc5745ac-0808-4b1c-a447-9597236492b0-kube-api-access-hhnfc" (OuterVolumeSpecName: "kube-api-access-hhnfc") pod "cc5745ac-0808-4b1c-a447-9597236492b0" (UID: "cc5745ac-0808-4b1c-a447-9597236492b0"). InnerVolumeSpecName "kube-api-access-hhnfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.999394 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhnfc\" (UniqueName: \"kubernetes.io/projected/cc5745ac-0808-4b1c-a447-9597236492b0-kube-api-access-hhnfc\") on node \"crc\" DevicePath \"\"" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.999441 5008 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc5745ac-0808-4b1c-a447-9597236492b0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 19:30:02 crc kubenswrapper[5008]: I0318 19:30:02.999454 5008 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc5745ac-0808-4b1c-a447-9597236492b0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.188572 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.193977 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-649dc66575-d5m6l"] Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.418646 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.432077 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-sb-2" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.457115 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.500344 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.500412 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564370-zpbkr" event={"ID":"cc5745ac-0808-4b1c-a447-9597236492b0","Type":"ContainerDied","Data":"c8180e67c479e5049bc2eec507e3447121483cede84cf7d8e9a1d98ea91ba4be"} Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.500470 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8180e67c479e5049bc2eec507e3447121483cede84cf7d8e9a1d98ea91ba4be" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.501522 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.503108 5008 generic.go:334] "Generic (PLEG): container finished" podID="1be5fd66-baab-4624-a51e-17d49300c05a" containerID="10cc64796f9e634bcb7ab15134249efbcf1ebcdaa30ec7bc541caa2258fa378c" exitCode=0 Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.503221 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564370-d58j9" event={"ID":"1be5fd66-baab-4624-a51e-17d49300c05a","Type":"ContainerDied","Data":"10cc64796f9e634bcb7ab15134249efbcf1ebcdaa30ec7bc541caa2258fa378c"} Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.506089 5008 generic.go:334] "Generic (PLEG): container finished" podID="32beb433-6bea-4bf9-b667-9cc7dd6c4444" containerID="df452718d486eb042c7e4c826570640e823caf5622d1b8d50a8f394f79ac6881" exitCode=0 Mar 18 
19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.507840 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649dc66575-d5m6l" event={"ID":"32beb433-6bea-4bf9-b667-9cc7dd6c4444","Type":"ContainerDied","Data":"df452718d486eb042c7e4c826570640e823caf5622d1b8d50a8f394f79ac6881"} Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.507871 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649dc66575-d5m6l" event={"ID":"32beb433-6bea-4bf9-b667-9cc7dd6c4444","Type":"ContainerStarted","Data":"bdccc1833d2b313ccac5d332ff30fd8d5c5738888072355f8b1bede8161160bf"} Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.576760 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.640837 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.740779 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-649dc66575-d5m6l"] Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.779678 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79d747d849-gfrl7"] Mar 18 19:30:03 crc kubenswrapper[5008]: E0318 19:30:03.780321 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc5745ac-0808-4b1c-a447-9597236492b0" containerName="collect-profiles" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.780422 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5745ac-0808-4b1c-a447-9597236492b0" containerName="collect-profiles" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.780660 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc5745ac-0808-4b1c-a447-9597236492b0" containerName="collect-profiles" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.781469 5008 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.793872 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.798927 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79d747d849-gfrl7"] Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.914497 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx"] Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.917441 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-dns-svc\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.917506 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkd6b\" (UniqueName: \"kubernetes.io/projected/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-kube-api-access-nkd6b\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.917546 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-config\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.917757 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-ovsdbserver-sb\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.917940 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-ovsdbserver-nb\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:03 crc kubenswrapper[5008]: I0318 19:30:03.926295 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564325-zjpkx"] Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.019740 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-dns-svc\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.019809 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkd6b\" (UniqueName: \"kubernetes.io/projected/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-kube-api-access-nkd6b\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.019834 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-config\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:04 crc 
kubenswrapper[5008]: I0318 19:30:04.019874 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-ovsdbserver-sb\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.019906 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-ovsdbserver-nb\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.020943 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-ovsdbserver-nb\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.021153 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-ovsdbserver-sb\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.021193 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-dns-svc\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.021718 5008 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-config\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.060888 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkd6b\" (UniqueName: \"kubernetes.io/projected/2b0f0d45-3f1b-478e-9f72-767efaeb3b10-kube-api-access-nkd6b\") pod \"dnsmasq-dns-79d747d849-gfrl7\" (UID: \"2b0f0d45-3f1b-478e-9f72-767efaeb3b10\") " pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.108015 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.208023 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb9175af-b974-4766-9c81-ffe6765a2099" path="/var/lib/kubelet/pods/eb9175af-b974-4766-9c81-ffe6765a2099/volumes" Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.513988 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649dc66575-d5m6l" event={"ID":"32beb433-6bea-4bf9-b667-9cc7dd6c4444","Type":"ContainerStarted","Data":"bcfe0f4ae3202cb401dde060a859d1b66a8f9b57d635f1737efd3db699e45ca5"} Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.514354 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-649dc66575-d5m6l" podUID="32beb433-6bea-4bf9-b667-9cc7dd6c4444" containerName="dnsmasq-dns" containerID="cri-o://bcfe0f4ae3202cb401dde060a859d1b66a8f9b57d635f1737efd3db699e45ca5" gracePeriod=10 Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.539519 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-649dc66575-d5m6l" podStartSLOduration=2.539452651 
podStartE2EDuration="2.539452651s" podCreationTimestamp="2026-03-18 19:30:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:30:04.537309775 +0000 UTC m=+5261.056782874" watchObservedRunningTime="2026-03-18 19:30:04.539452651 +0000 UTC m=+5261.058925730" Mar 18 19:30:04 crc kubenswrapper[5008]: W0318 19:30:04.695323 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b0f0d45_3f1b_478e_9f72_767efaeb3b10.slice/crio-83a961208e00f4fc51b515b7ef801f2105455e52b9377432e88553a98e86b018 WatchSource:0}: Error finding container 83a961208e00f4fc51b515b7ef801f2105455e52b9377432e88553a98e86b018: Status 404 returned error can't find the container with id 83a961208e00f4fc51b515b7ef801f2105455e52b9377432e88553a98e86b018 Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.708759 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79d747d849-gfrl7"] Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.841784 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564370-d58j9" Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.948280 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqkrc\" (UniqueName: \"kubernetes.io/projected/1be5fd66-baab-4624-a51e-17d49300c05a-kube-api-access-nqkrc\") pod \"1be5fd66-baab-4624-a51e-17d49300c05a\" (UID: \"1be5fd66-baab-4624-a51e-17d49300c05a\") " Mar 18 19:30:04 crc kubenswrapper[5008]: I0318 19:30:04.953863 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be5fd66-baab-4624-a51e-17d49300c05a-kube-api-access-nqkrc" (OuterVolumeSpecName: "kube-api-access-nqkrc") pod "1be5fd66-baab-4624-a51e-17d49300c05a" (UID: "1be5fd66-baab-4624-a51e-17d49300c05a"). 
InnerVolumeSpecName "kube-api-access-nqkrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.004737 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.050442 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqkrc\" (UniqueName: \"kubernetes.io/projected/1be5fd66-baab-4624-a51e-17d49300c05a-kube-api-access-nqkrc\") on node \"crc\" DevicePath \"\"" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.151806 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-dns-svc\") pod \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.151999 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8wpq\" (UniqueName: \"kubernetes.io/projected/32beb433-6bea-4bf9-b667-9cc7dd6c4444-kube-api-access-b8wpq\") pod \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.152185 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-config\") pod \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.152387 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-ovsdbserver-nb\") pod \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\" (UID: \"32beb433-6bea-4bf9-b667-9cc7dd6c4444\") " Mar 18 19:30:05 crc 
kubenswrapper[5008]: I0318 19:30:05.155286 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32beb433-6bea-4bf9-b667-9cc7dd6c4444-kube-api-access-b8wpq" (OuterVolumeSpecName: "kube-api-access-b8wpq") pod "32beb433-6bea-4bf9-b667-9cc7dd6c4444" (UID: "32beb433-6bea-4bf9-b667-9cc7dd6c4444"). InnerVolumeSpecName "kube-api-access-b8wpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.191441 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32beb433-6bea-4bf9-b667-9cc7dd6c4444" (UID: "32beb433-6bea-4bf9-b667-9cc7dd6c4444"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.193096 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32beb433-6bea-4bf9-b667-9cc7dd6c4444" (UID: "32beb433-6bea-4bf9-b667-9cc7dd6c4444"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.214238 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-config" (OuterVolumeSpecName: "config") pod "32beb433-6bea-4bf9-b667-9cc7dd6c4444" (UID: "32beb433-6bea-4bf9-b667-9cc7dd6c4444"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.254996 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.255056 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8wpq\" (UniqueName: \"kubernetes.io/projected/32beb433-6bea-4bf9-b667-9cc7dd6c4444-kube-api-access-b8wpq\") on node \"crc\" DevicePath \"\"" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.255081 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-config\") on node \"crc\" DevicePath \"\"" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.255099 5008 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32beb433-6bea-4bf9-b667-9cc7dd6c4444-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.533366 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564370-d58j9" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.533377 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564370-d58j9" event={"ID":"1be5fd66-baab-4624-a51e-17d49300c05a","Type":"ContainerDied","Data":"ae445ba7fa070e20cfc29005df1ebebbe7c177f65eb01ad474cdcaad07aa03a9"} Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.533411 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae445ba7fa070e20cfc29005df1ebebbe7c177f65eb01ad474cdcaad07aa03a9" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.537108 5008 generic.go:334] "Generic (PLEG): container finished" podID="2b0f0d45-3f1b-478e-9f72-767efaeb3b10" containerID="7e3851aec74ff4dd4122f8e10660335e8644be676a0ef83d9496b437b2adbf8a" exitCode=0 Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.537232 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79d747d849-gfrl7" event={"ID":"2b0f0d45-3f1b-478e-9f72-767efaeb3b10","Type":"ContainerDied","Data":"7e3851aec74ff4dd4122f8e10660335e8644be676a0ef83d9496b437b2adbf8a"} Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.537286 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79d747d849-gfrl7" event={"ID":"2b0f0d45-3f1b-478e-9f72-767efaeb3b10","Type":"ContainerStarted","Data":"83a961208e00f4fc51b515b7ef801f2105455e52b9377432e88553a98e86b018"} Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.542263 5008 generic.go:334] "Generic (PLEG): container finished" podID="32beb433-6bea-4bf9-b667-9cc7dd6c4444" containerID="bcfe0f4ae3202cb401dde060a859d1b66a8f9b57d635f1737efd3db699e45ca5" exitCode=0 Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.542320 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649dc66575-d5m6l" 
event={"ID":"32beb433-6bea-4bf9-b667-9cc7dd6c4444","Type":"ContainerDied","Data":"bcfe0f4ae3202cb401dde060a859d1b66a8f9b57d635f1737efd3db699e45ca5"} Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.542354 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649dc66575-d5m6l" event={"ID":"32beb433-6bea-4bf9-b667-9cc7dd6c4444","Type":"ContainerDied","Data":"bdccc1833d2b313ccac5d332ff30fd8d5c5738888072355f8b1bede8161160bf"} Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.542376 5008 scope.go:117] "RemoveContainer" containerID="bcfe0f4ae3202cb401dde060a859d1b66a8f9b57d635f1737efd3db699e45ca5" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.542406 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649dc66575-d5m6l" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.590872 5008 scope.go:117] "RemoveContainer" containerID="df452718d486eb042c7e4c826570640e823caf5622d1b8d50a8f394f79ac6881" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.599813 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-649dc66575-d5m6l"] Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.605886 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-649dc66575-d5m6l"] Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.629857 5008 scope.go:117] "RemoveContainer" containerID="bcfe0f4ae3202cb401dde060a859d1b66a8f9b57d635f1737efd3db699e45ca5" Mar 18 19:30:05 crc kubenswrapper[5008]: E0318 19:30:05.630473 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcfe0f4ae3202cb401dde060a859d1b66a8f9b57d635f1737efd3db699e45ca5\": container with ID starting with bcfe0f4ae3202cb401dde060a859d1b66a8f9b57d635f1737efd3db699e45ca5 not found: ID does not exist" containerID="bcfe0f4ae3202cb401dde060a859d1b66a8f9b57d635f1737efd3db699e45ca5" Mar 18 19:30:05 
crc kubenswrapper[5008]: I0318 19:30:05.630539 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcfe0f4ae3202cb401dde060a859d1b66a8f9b57d635f1737efd3db699e45ca5"} err="failed to get container status \"bcfe0f4ae3202cb401dde060a859d1b66a8f9b57d635f1737efd3db699e45ca5\": rpc error: code = NotFound desc = could not find container \"bcfe0f4ae3202cb401dde060a859d1b66a8f9b57d635f1737efd3db699e45ca5\": container with ID starting with bcfe0f4ae3202cb401dde060a859d1b66a8f9b57d635f1737efd3db699e45ca5 not found: ID does not exist" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.630641 5008 scope.go:117] "RemoveContainer" containerID="df452718d486eb042c7e4c826570640e823caf5622d1b8d50a8f394f79ac6881" Mar 18 19:30:05 crc kubenswrapper[5008]: E0318 19:30:05.631055 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df452718d486eb042c7e4c826570640e823caf5622d1b8d50a8f394f79ac6881\": container with ID starting with df452718d486eb042c7e4c826570640e823caf5622d1b8d50a8f394f79ac6881 not found: ID does not exist" containerID="df452718d486eb042c7e4c826570640e823caf5622d1b8d50a8f394f79ac6881" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.631087 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df452718d486eb042c7e4c826570640e823caf5622d1b8d50a8f394f79ac6881"} err="failed to get container status \"df452718d486eb042c7e4c826570640e823caf5622d1b8d50a8f394f79ac6881\": rpc error: code = NotFound desc = could not find container \"df452718d486eb042c7e4c826570640e823caf5622d1b8d50a8f394f79ac6881\": container with ID starting with df452718d486eb042c7e4c826570640e823caf5622d1b8d50a8f394f79ac6881 not found: ID does not exist" Mar 18 19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.898757 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564364-9m4dm"] Mar 18 
19:30:05 crc kubenswrapper[5008]: I0318 19:30:05.904497 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564364-9m4dm"] Mar 18 19:30:06 crc kubenswrapper[5008]: I0318 19:30:06.209152 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06448434-1769-4027-a455-8fcc1ddff0f4" path="/var/lib/kubelet/pods/06448434-1769-4027-a455-8fcc1ddff0f4/volumes" Mar 18 19:30:06 crc kubenswrapper[5008]: I0318 19:30:06.210727 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32beb433-6bea-4bf9-b667-9cc7dd6c4444" path="/var/lib/kubelet/pods/32beb433-6bea-4bf9-b667-9cc7dd6c4444/volumes" Mar 18 19:30:06 crc kubenswrapper[5008]: I0318 19:30:06.561426 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79d747d849-gfrl7" event={"ID":"2b0f0d45-3f1b-478e-9f72-767efaeb3b10","Type":"ContainerStarted","Data":"fa8c1453ec982b7fe0f885acfe9f7ae73c24942841bdeab79d67a40a4bd12b94"} Mar 18 19:30:06 crc kubenswrapper[5008]: I0318 19:30:06.563224 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:07 crc kubenswrapper[5008]: I0318 19:30:07.454378 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Mar 18 19:30:07 crc kubenswrapper[5008]: I0318 19:30:07.483674 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79d747d849-gfrl7" podStartSLOduration=4.483652957 podStartE2EDuration="4.483652957s" podCreationTimestamp="2026-03-18 19:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:30:06.62042665 +0000 UTC m=+5263.139899729" watchObservedRunningTime="2026-03-18 19:30:07.483652957 +0000 UTC m=+5264.003126046" Mar 18 19:30:09 crc kubenswrapper[5008]: I0318 19:30:09.979334 5008 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ovn-copy-data"] Mar 18 19:30:09 crc kubenswrapper[5008]: E0318 19:30:09.980087 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32beb433-6bea-4bf9-b667-9cc7dd6c4444" containerName="init" Mar 18 19:30:09 crc kubenswrapper[5008]: I0318 19:30:09.980106 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="32beb433-6bea-4bf9-b667-9cc7dd6c4444" containerName="init" Mar 18 19:30:09 crc kubenswrapper[5008]: E0318 19:30:09.980158 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32beb433-6bea-4bf9-b667-9cc7dd6c4444" containerName="dnsmasq-dns" Mar 18 19:30:09 crc kubenswrapper[5008]: I0318 19:30:09.980166 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="32beb433-6bea-4bf9-b667-9cc7dd6c4444" containerName="dnsmasq-dns" Mar 18 19:30:09 crc kubenswrapper[5008]: E0318 19:30:09.980181 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be5fd66-baab-4624-a51e-17d49300c05a" containerName="oc" Mar 18 19:30:09 crc kubenswrapper[5008]: I0318 19:30:09.980189 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be5fd66-baab-4624-a51e-17d49300c05a" containerName="oc" Mar 18 19:30:09 crc kubenswrapper[5008]: I0318 19:30:09.980373 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="32beb433-6bea-4bf9-b667-9cc7dd6c4444" containerName="dnsmasq-dns" Mar 18 19:30:09 crc kubenswrapper[5008]: I0318 19:30:09.980398 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be5fd66-baab-4624-a51e-17d49300c05a" containerName="oc" Mar 18 19:30:09 crc kubenswrapper[5008]: I0318 19:30:09.981149 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 19:30:09 crc kubenswrapper[5008]: I0318 19:30:09.984671 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 18 19:30:09 crc kubenswrapper[5008]: I0318 19:30:09.995693 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.045165 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-793ab1a9-2a4c-44eb-be0e-a4420d3c22bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-793ab1a9-2a4c-44eb-be0e-a4420d3c22bb\") pod \"ovn-copy-data\" (UID: \"bcd4e301-5199-4e63-b07e-4949232a96d2\") " pod="openstack/ovn-copy-data" Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.045251 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/bcd4e301-5199-4e63-b07e-4949232a96d2-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"bcd4e301-5199-4e63-b07e-4949232a96d2\") " pod="openstack/ovn-copy-data" Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.045340 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztxwj\" (UniqueName: \"kubernetes.io/projected/bcd4e301-5199-4e63-b07e-4949232a96d2-kube-api-access-ztxwj\") pod \"ovn-copy-data\" (UID: \"bcd4e301-5199-4e63-b07e-4949232a96d2\") " pod="openstack/ovn-copy-data" Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.147514 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/bcd4e301-5199-4e63-b07e-4949232a96d2-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"bcd4e301-5199-4e63-b07e-4949232a96d2\") " pod="openstack/ovn-copy-data" Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.147708 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ztxwj\" (UniqueName: \"kubernetes.io/projected/bcd4e301-5199-4e63-b07e-4949232a96d2-kube-api-access-ztxwj\") pod \"ovn-copy-data\" (UID: \"bcd4e301-5199-4e63-b07e-4949232a96d2\") " pod="openstack/ovn-copy-data" Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.147806 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-793ab1a9-2a4c-44eb-be0e-a4420d3c22bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-793ab1a9-2a4c-44eb-be0e-a4420d3c22bb\") pod \"ovn-copy-data\" (UID: \"bcd4e301-5199-4e63-b07e-4949232a96d2\") " pod="openstack/ovn-copy-data" Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.151158 5008 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.151397 5008 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-793ab1a9-2a4c-44eb-be0e-a4420d3c22bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-793ab1a9-2a4c-44eb-be0e-a4420d3c22bb\") pod \"ovn-copy-data\" (UID: \"bcd4e301-5199-4e63-b07e-4949232a96d2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/881d3822a84420f12473e7a3e5db343ebbdd060ed9e348e95059d8c1828b038b/globalmount\"" pod="openstack/ovn-copy-data" Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.152964 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/bcd4e301-5199-4e63-b07e-4949232a96d2-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"bcd4e301-5199-4e63-b07e-4949232a96d2\") " pod="openstack/ovn-copy-data" Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.175214 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztxwj\" (UniqueName: 
\"kubernetes.io/projected/bcd4e301-5199-4e63-b07e-4949232a96d2-kube-api-access-ztxwj\") pod \"ovn-copy-data\" (UID: \"bcd4e301-5199-4e63-b07e-4949232a96d2\") " pod="openstack/ovn-copy-data" Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.189130 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-793ab1a9-2a4c-44eb-be0e-a4420d3c22bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-793ab1a9-2a4c-44eb-be0e-a4420d3c22bb\") pod \"ovn-copy-data\" (UID: \"bcd4e301-5199-4e63-b07e-4949232a96d2\") " pod="openstack/ovn-copy-data" Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.198527 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:30:10 crc kubenswrapper[5008]: E0318 19:30:10.198951 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.315145 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 19:30:10 crc kubenswrapper[5008]: I0318 19:30:10.817611 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 19:30:11 crc kubenswrapper[5008]: I0318 19:30:11.599201 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"bcd4e301-5199-4e63-b07e-4949232a96d2","Type":"ContainerStarted","Data":"eb68b1f1e9ddaf745bc7f486f12fb84deb4aaf3853c34a599e2cb255e20dda52"} Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.110642 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79d747d849-gfrl7" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.188454 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-v8j2b"] Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.188832 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" podUID="190656d4-52a0-4b74-9472-d74daeb2d2be" containerName="dnsmasq-dns" containerID="cri-o://133e68700e437bbf2545fcbbf4120be1e1fe88222d0353e6b4689cd616eff500" gracePeriod=10 Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.621589 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"bcd4e301-5199-4e63-b07e-4949232a96d2","Type":"ContainerStarted","Data":"bfba806ce85dc66b5bd547ee7b00f53ce0a2d987f6434241f4b1c7d971fa892d"} Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.624611 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.625047 5008 generic.go:334] "Generic (PLEG): container finished" podID="190656d4-52a0-4b74-9472-d74daeb2d2be" containerID="133e68700e437bbf2545fcbbf4120be1e1fe88222d0353e6b4689cd616eff500" exitCode=0 Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.625081 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" event={"ID":"190656d4-52a0-4b74-9472-d74daeb2d2be","Type":"ContainerDied","Data":"133e68700e437bbf2545fcbbf4120be1e1fe88222d0353e6b4689cd616eff500"} Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.625102 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" event={"ID":"190656d4-52a0-4b74-9472-d74daeb2d2be","Type":"ContainerDied","Data":"e077ce6e1f6a6e08b776407c49e8e938752d354b6c74f0d6af105e56c1ef013f"} Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.625120 5008 scope.go:117] "RemoveContainer" containerID="133e68700e437bbf2545fcbbf4120be1e1fe88222d0353e6b4689cd616eff500" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.639136 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.641658712 podStartE2EDuration="6.639115815s" podCreationTimestamp="2026-03-18 19:30:08 +0000 UTC" firstStartedPulling="2026-03-18 19:30:10.818320942 +0000 UTC m=+5267.337794021" lastFinishedPulling="2026-03-18 19:30:13.815778035 +0000 UTC m=+5270.335251124" observedRunningTime="2026-03-18 19:30:14.634960906 +0000 UTC m=+5271.154433985" watchObservedRunningTime="2026-03-18 19:30:14.639115815 +0000 UTC m=+5271.158588894" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.643469 5008 scope.go:117] "RemoveContainer" containerID="5b358adaa5b8af278a337f2f32328b20a69b8b793a2772cc07f1a4babc69d97d" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.668573 5008 
scope.go:117] "RemoveContainer" containerID="133e68700e437bbf2545fcbbf4120be1e1fe88222d0353e6b4689cd616eff500" Mar 18 19:30:14 crc kubenswrapper[5008]: E0318 19:30:14.669684 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"133e68700e437bbf2545fcbbf4120be1e1fe88222d0353e6b4689cd616eff500\": container with ID starting with 133e68700e437bbf2545fcbbf4120be1e1fe88222d0353e6b4689cd616eff500 not found: ID does not exist" containerID="133e68700e437bbf2545fcbbf4120be1e1fe88222d0353e6b4689cd616eff500" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.669731 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"133e68700e437bbf2545fcbbf4120be1e1fe88222d0353e6b4689cd616eff500"} err="failed to get container status \"133e68700e437bbf2545fcbbf4120be1e1fe88222d0353e6b4689cd616eff500\": rpc error: code = NotFound desc = could not find container \"133e68700e437bbf2545fcbbf4120be1e1fe88222d0353e6b4689cd616eff500\": container with ID starting with 133e68700e437bbf2545fcbbf4120be1e1fe88222d0353e6b4689cd616eff500 not found: ID does not exist" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.669775 5008 scope.go:117] "RemoveContainer" containerID="5b358adaa5b8af278a337f2f32328b20a69b8b793a2772cc07f1a4babc69d97d" Mar 18 19:30:14 crc kubenswrapper[5008]: E0318 19:30:14.671044 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b358adaa5b8af278a337f2f32328b20a69b8b793a2772cc07f1a4babc69d97d\": container with ID starting with 5b358adaa5b8af278a337f2f32328b20a69b8b793a2772cc07f1a4babc69d97d not found: ID does not exist" containerID="5b358adaa5b8af278a337f2f32328b20a69b8b793a2772cc07f1a4babc69d97d" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.671082 5008 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5b358adaa5b8af278a337f2f32328b20a69b8b793a2772cc07f1a4babc69d97d"} err="failed to get container status \"5b358adaa5b8af278a337f2f32328b20a69b8b793a2772cc07f1a4babc69d97d\": rpc error: code = NotFound desc = could not find container \"5b358adaa5b8af278a337f2f32328b20a69b8b793a2772cc07f1a4babc69d97d\": container with ID starting with 5b358adaa5b8af278a337f2f32328b20a69b8b793a2772cc07f1a4babc69d97d not found: ID does not exist" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.790730 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/190656d4-52a0-4b74-9472-d74daeb2d2be-config\") pod \"190656d4-52a0-4b74-9472-d74daeb2d2be\" (UID: \"190656d4-52a0-4b74-9472-d74daeb2d2be\") " Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.790852 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pqgn\" (UniqueName: \"kubernetes.io/projected/190656d4-52a0-4b74-9472-d74daeb2d2be-kube-api-access-4pqgn\") pod \"190656d4-52a0-4b74-9472-d74daeb2d2be\" (UID: \"190656d4-52a0-4b74-9472-d74daeb2d2be\") " Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.790935 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/190656d4-52a0-4b74-9472-d74daeb2d2be-dns-svc\") pod \"190656d4-52a0-4b74-9472-d74daeb2d2be\" (UID: \"190656d4-52a0-4b74-9472-d74daeb2d2be\") " Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.796651 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/190656d4-52a0-4b74-9472-d74daeb2d2be-kube-api-access-4pqgn" (OuterVolumeSpecName: "kube-api-access-4pqgn") pod "190656d4-52a0-4b74-9472-d74daeb2d2be" (UID: "190656d4-52a0-4b74-9472-d74daeb2d2be"). InnerVolumeSpecName "kube-api-access-4pqgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.835253 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/190656d4-52a0-4b74-9472-d74daeb2d2be-config" (OuterVolumeSpecName: "config") pod "190656d4-52a0-4b74-9472-d74daeb2d2be" (UID: "190656d4-52a0-4b74-9472-d74daeb2d2be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.836354 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/190656d4-52a0-4b74-9472-d74daeb2d2be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "190656d4-52a0-4b74-9472-d74daeb2d2be" (UID: "190656d4-52a0-4b74-9472-d74daeb2d2be"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.893001 5008 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/190656d4-52a0-4b74-9472-d74daeb2d2be-config\") on node \"crc\" DevicePath \"\"" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.893038 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pqgn\" (UniqueName: \"kubernetes.io/projected/190656d4-52a0-4b74-9472-d74daeb2d2be-kube-api-access-4pqgn\") on node \"crc\" DevicePath \"\"" Mar 18 19:30:14 crc kubenswrapper[5008]: I0318 19:30:14.893051 5008 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/190656d4-52a0-4b74-9472-d74daeb2d2be-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 19:30:15 crc kubenswrapper[5008]: I0318 19:30:15.637035 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-684c864bc9-v8j2b" Mar 18 19:30:15 crc kubenswrapper[5008]: I0318 19:30:15.691430 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-v8j2b"] Mar 18 19:30:15 crc kubenswrapper[5008]: I0318 19:30:15.704757 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-v8j2b"] Mar 18 19:30:16 crc kubenswrapper[5008]: I0318 19:30:16.207040 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="190656d4-52a0-4b74-9472-d74daeb2d2be" path="/var/lib/kubelet/pods/190656d4-52a0-4b74-9472-d74daeb2d2be/volumes" Mar 18 19:30:19 crc kubenswrapper[5008]: I0318 19:30:19.959890 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 19:30:19 crc kubenswrapper[5008]: E0318 19:30:19.960442 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190656d4-52a0-4b74-9472-d74daeb2d2be" containerName="init" Mar 18 19:30:19 crc kubenswrapper[5008]: I0318 19:30:19.960458 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="190656d4-52a0-4b74-9472-d74daeb2d2be" containerName="init" Mar 18 19:30:19 crc kubenswrapper[5008]: E0318 19:30:19.960484 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190656d4-52a0-4b74-9472-d74daeb2d2be" containerName="dnsmasq-dns" Mar 18 19:30:19 crc kubenswrapper[5008]: I0318 19:30:19.960491 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="190656d4-52a0-4b74-9472-d74daeb2d2be" containerName="dnsmasq-dns" Mar 18 19:30:19 crc kubenswrapper[5008]: I0318 19:30:19.960660 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="190656d4-52a0-4b74-9472-d74daeb2d2be" containerName="dnsmasq-dns" Mar 18 19:30:19 crc kubenswrapper[5008]: I0318 19:30:19.961494 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 19:30:19 crc kubenswrapper[5008]: I0318 19:30:19.964100 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 19:30:19 crc kubenswrapper[5008]: I0318 19:30:19.965475 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5zzqq" Mar 18 19:30:19 crc kubenswrapper[5008]: I0318 19:30:19.966807 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.009676 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.076240 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ce66255-5329-4e59-a9e9-aee0f4b845e7-scripts\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.076305 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0ce66255-5329-4e59-a9e9-aee0f4b845e7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.076418 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ce66255-5329-4e59-a9e9-aee0f4b845e7-config\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.076589 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bws8h\" 
(UniqueName: \"kubernetes.io/projected/0ce66255-5329-4e59-a9e9-aee0f4b845e7-kube-api-access-bws8h\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.076933 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce66255-5329-4e59-a9e9-aee0f4b845e7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.178301 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce66255-5329-4e59-a9e9-aee0f4b845e7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.178373 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ce66255-5329-4e59-a9e9-aee0f4b845e7-scripts\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.178412 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0ce66255-5329-4e59-a9e9-aee0f4b845e7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.178436 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ce66255-5329-4e59-a9e9-aee0f4b845e7-config\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc 
kubenswrapper[5008]: I0318 19:30:20.178464 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bws8h\" (UniqueName: \"kubernetes.io/projected/0ce66255-5329-4e59-a9e9-aee0f4b845e7-kube-api-access-bws8h\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.178957 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0ce66255-5329-4e59-a9e9-aee0f4b845e7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.179313 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ce66255-5329-4e59-a9e9-aee0f4b845e7-scripts\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.179388 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ce66255-5329-4e59-a9e9-aee0f4b845e7-config\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.185325 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce66255-5329-4e59-a9e9-aee0f4b845e7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.195161 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bws8h\" (UniqueName: \"kubernetes.io/projected/0ce66255-5329-4e59-a9e9-aee0f4b845e7-kube-api-access-bws8h\") pod \"ovn-northd-0\" 
(UID: \"0ce66255-5329-4e59-a9e9-aee0f4b845e7\") " pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.286169 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 19:30:20 crc kubenswrapper[5008]: W0318 19:30:20.763111 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ce66255_5329_4e59_a9e9_aee0f4b845e7.slice/crio-028ea66e67e0bcb82314de21dada4825a643c3ceb82053063fd8e355e7e9a323 WatchSource:0}: Error finding container 028ea66e67e0bcb82314de21dada4825a643c3ceb82053063fd8e355e7e9a323: Status 404 returned error can't find the container with id 028ea66e67e0bcb82314de21dada4825a643c3ceb82053063fd8e355e7e9a323 Mar 18 19:30:20 crc kubenswrapper[5008]: I0318 19:30:20.763282 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 19:30:21 crc kubenswrapper[5008]: I0318 19:30:21.688859 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0ce66255-5329-4e59-a9e9-aee0f4b845e7","Type":"ContainerStarted","Data":"732b9860325f24efe98f177588239ea33414618d6e306e35c910c74125a27f8a"} Mar 18 19:30:21 crc kubenswrapper[5008]: I0318 19:30:21.689404 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0ce66255-5329-4e59-a9e9-aee0f4b845e7","Type":"ContainerStarted","Data":"e01279bf5451cbfcf703e731aacc9bb6f1bdb88a0cf20f8fd96d0cf6929b1b1e"} Mar 18 19:30:21 crc kubenswrapper[5008]: I0318 19:30:21.689425 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 19:30:21 crc kubenswrapper[5008]: I0318 19:30:21.689438 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0ce66255-5329-4e59-a9e9-aee0f4b845e7","Type":"ContainerStarted","Data":"028ea66e67e0bcb82314de21dada4825a643c3ceb82053063fd8e355e7e9a323"} Mar 
18 19:30:21 crc kubenswrapper[5008]: I0318 19:30:21.718648 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.718619802 podStartE2EDuration="2.718619802s" podCreationTimestamp="2026-03-18 19:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 19:30:21.704527612 +0000 UTC m=+5278.224000691" watchObservedRunningTime="2026-03-18 19:30:21.718619802 +0000 UTC m=+5278.238092911" Mar 18 19:30:25 crc kubenswrapper[5008]: I0318 19:30:25.198410 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:30:25 crc kubenswrapper[5008]: E0318 19:30:25.198947 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:30:40 crc kubenswrapper[5008]: I0318 19:30:40.199694 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:30:40 crc kubenswrapper[5008]: E0318 19:30:40.200884 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:30:40 crc kubenswrapper[5008]: I0318 19:30:40.391719 5008 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 19:30:44 crc kubenswrapper[5008]: I0318 19:30:44.916788 5008 scope.go:117] "RemoveContainer" containerID="693e20e09e839bab6b160169d6ebb10e485bf25efd6f9e8784901c4b8a82398f" Mar 18 19:30:44 crc kubenswrapper[5008]: I0318 19:30:44.984164 5008 scope.go:117] "RemoveContainer" containerID="e2d0ef204a166e30edc9dd4e618f6321828bb1606f9e2a5163376d95def2ff6c" Mar 18 19:30:54 crc kubenswrapper[5008]: I0318 19:30:54.209427 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:30:54 crc kubenswrapper[5008]: E0318 19:30:54.211548 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" Mar 18 19:31:06 crc kubenswrapper[5008]: I0318 19:31:06.198167 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7" Mar 18 19:31:07 crc kubenswrapper[5008]: I0318 19:31:07.137620 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"4ed354de23509ced81330dce6064f0e54cdb4d741d3f848d673994d0c464ccba"} Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.216979 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vnkpt/must-gather-5trpz"] Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.218862 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnkpt/must-gather-5trpz" Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.220856 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vnkpt"/"default-dockercfg-qwff6" Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.220953 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vnkpt"/"kube-root-ca.crt" Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.221987 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vnkpt"/"openshift-service-ca.crt" Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.224719 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vnkpt/must-gather-5trpz"] Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.351423 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmdgk\" (UniqueName: \"kubernetes.io/projected/8c131199-f3fc-473d-be6f-7f3f42b8c542-kube-api-access-nmdgk\") pod \"must-gather-5trpz\" (UID: \"8c131199-f3fc-473d-be6f-7f3f42b8c542\") " pod="openshift-must-gather-vnkpt/must-gather-5trpz" Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.351602 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8c131199-f3fc-473d-be6f-7f3f42b8c542-must-gather-output\") pod \"must-gather-5trpz\" (UID: \"8c131199-f3fc-473d-be6f-7f3f42b8c542\") " pod="openshift-must-gather-vnkpt/must-gather-5trpz" Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.453125 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmdgk\" (UniqueName: \"kubernetes.io/projected/8c131199-f3fc-473d-be6f-7f3f42b8c542-kube-api-access-nmdgk\") pod \"must-gather-5trpz\" (UID: \"8c131199-f3fc-473d-be6f-7f3f42b8c542\") " 
pod="openshift-must-gather-vnkpt/must-gather-5trpz" Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.453252 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8c131199-f3fc-473d-be6f-7f3f42b8c542-must-gather-output\") pod \"must-gather-5trpz\" (UID: \"8c131199-f3fc-473d-be6f-7f3f42b8c542\") " pod="openshift-must-gather-vnkpt/must-gather-5trpz" Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.454170 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8c131199-f3fc-473d-be6f-7f3f42b8c542-must-gather-output\") pod \"must-gather-5trpz\" (UID: \"8c131199-f3fc-473d-be6f-7f3f42b8c542\") " pod="openshift-must-gather-vnkpt/must-gather-5trpz" Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.472064 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmdgk\" (UniqueName: \"kubernetes.io/projected/8c131199-f3fc-473d-be6f-7f3f42b8c542-kube-api-access-nmdgk\") pod \"must-gather-5trpz\" (UID: \"8c131199-f3fc-473d-be6f-7f3f42b8c542\") " pod="openshift-must-gather-vnkpt/must-gather-5trpz" Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.538312 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnkpt/must-gather-5trpz" Mar 18 19:31:19 crc kubenswrapper[5008]: I0318 19:31:19.987523 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vnkpt/must-gather-5trpz"] Mar 18 19:31:20 crc kubenswrapper[5008]: I0318 19:31:20.264539 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnkpt/must-gather-5trpz" event={"ID":"8c131199-f3fc-473d-be6f-7f3f42b8c542","Type":"ContainerStarted","Data":"90f2a649676901b479ac9ed58064e9c1ae72602d16fde7ec745051795af88068"} Mar 18 19:31:26 crc kubenswrapper[5008]: I0318 19:31:26.314763 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnkpt/must-gather-5trpz" event={"ID":"8c131199-f3fc-473d-be6f-7f3f42b8c542","Type":"ContainerStarted","Data":"708f496f9ab39dcd85911634502f4126814721f0e02eea2583fabea7b70db17d"} Mar 18 19:31:26 crc kubenswrapper[5008]: I0318 19:31:26.315310 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnkpt/must-gather-5trpz" event={"ID":"8c131199-f3fc-473d-be6f-7f3f42b8c542","Type":"ContainerStarted","Data":"357cd3737a0fc9c7dbe2d52cefb39b0b126e017157debbb33ce559623517edac"} Mar 18 19:31:26 crc kubenswrapper[5008]: I0318 19:31:26.335269 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vnkpt/must-gather-5trpz" podStartSLOduration=1.466656819 podStartE2EDuration="7.335245651s" podCreationTimestamp="2026-03-18 19:31:19 +0000 UTC" firstStartedPulling="2026-03-18 19:31:19.99499235 +0000 UTC m=+5336.514465429" lastFinishedPulling="2026-03-18 19:31:25.863581182 +0000 UTC m=+5342.383054261" observedRunningTime="2026-03-18 19:31:26.329656004 +0000 UTC m=+5342.849129093" watchObservedRunningTime="2026-03-18 19:31:26.335245651 +0000 UTC m=+5342.854718730" Mar 18 19:31:26 crc kubenswrapper[5008]: I0318 19:31:26.707539 5008 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-vnkpt/crc-debug-hvl6v"] Mar 18 19:31:26 crc kubenswrapper[5008]: I0318 19:31:26.708942 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" Mar 18 19:31:26 crc kubenswrapper[5008]: I0318 19:31:26.781912 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03f8e795-1b0a-4c14-8b7e-b389a6766cff-host\") pod \"crc-debug-hvl6v\" (UID: \"03f8e795-1b0a-4c14-8b7e-b389a6766cff\") " pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" Mar 18 19:31:26 crc kubenswrapper[5008]: I0318 19:31:26.782183 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jstfr\" (UniqueName: \"kubernetes.io/projected/03f8e795-1b0a-4c14-8b7e-b389a6766cff-kube-api-access-jstfr\") pod \"crc-debug-hvl6v\" (UID: \"03f8e795-1b0a-4c14-8b7e-b389a6766cff\") " pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" Mar 18 19:31:26 crc kubenswrapper[5008]: I0318 19:31:26.883379 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03f8e795-1b0a-4c14-8b7e-b389a6766cff-host\") pod \"crc-debug-hvl6v\" (UID: \"03f8e795-1b0a-4c14-8b7e-b389a6766cff\") " pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" Mar 18 19:31:26 crc kubenswrapper[5008]: I0318 19:31:26.883692 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jstfr\" (UniqueName: \"kubernetes.io/projected/03f8e795-1b0a-4c14-8b7e-b389a6766cff-kube-api-access-jstfr\") pod \"crc-debug-hvl6v\" (UID: \"03f8e795-1b0a-4c14-8b7e-b389a6766cff\") " pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" Mar 18 19:31:26 crc kubenswrapper[5008]: I0318 19:31:26.883537 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/03f8e795-1b0a-4c14-8b7e-b389a6766cff-host\") pod \"crc-debug-hvl6v\" (UID: \"03f8e795-1b0a-4c14-8b7e-b389a6766cff\") " pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" Mar 18 19:31:26 crc kubenswrapper[5008]: I0318 19:31:26.906918 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jstfr\" (UniqueName: \"kubernetes.io/projected/03f8e795-1b0a-4c14-8b7e-b389a6766cff-kube-api-access-jstfr\") pod \"crc-debug-hvl6v\" (UID: \"03f8e795-1b0a-4c14-8b7e-b389a6766cff\") " pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" Mar 18 19:31:27 crc kubenswrapper[5008]: I0318 19:31:27.024079 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" Mar 18 19:31:27 crc kubenswrapper[5008]: W0318 19:31:27.047544 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03f8e795_1b0a_4c14_8b7e_b389a6766cff.slice/crio-8326ae4fbf049c94feec1f50c172b3ebf164d8f7c8b4d72d3606119d29dad4f7 WatchSource:0}: Error finding container 8326ae4fbf049c94feec1f50c172b3ebf164d8f7c8b4d72d3606119d29dad4f7: Status 404 returned error can't find the container with id 8326ae4fbf049c94feec1f50c172b3ebf164d8f7c8b4d72d3606119d29dad4f7 Mar 18 19:31:27 crc kubenswrapper[5008]: I0318 19:31:27.344865 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" event={"ID":"03f8e795-1b0a-4c14-8b7e-b389a6766cff","Type":"ContainerStarted","Data":"8326ae4fbf049c94feec1f50c172b3ebf164d8f7c8b4d72d3606119d29dad4f7"} Mar 18 19:31:38 crc kubenswrapper[5008]: I0318 19:31:38.426265 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" event={"ID":"03f8e795-1b0a-4c14-8b7e-b389a6766cff","Type":"ContainerStarted","Data":"b9f1a321ec3bfc10d5c6a2d5aaa5c93e4aee395ee3cf5c0d21d0b94b9099284b"} Mar 18 19:31:38 crc kubenswrapper[5008]: I0318 
19:31:38.444197 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" podStartSLOduration=1.995528519 podStartE2EDuration="12.444175465s" podCreationTimestamp="2026-03-18 19:31:26 +0000 UTC" firstStartedPulling="2026-03-18 19:31:27.049745058 +0000 UTC m=+5343.569218137" lastFinishedPulling="2026-03-18 19:31:37.498391994 +0000 UTC m=+5354.017865083" observedRunningTime="2026-03-18 19:31:38.440821747 +0000 UTC m=+5354.960294816" watchObservedRunningTime="2026-03-18 19:31:38.444175465 +0000 UTC m=+5354.963648544" Mar 18 19:32:00 crc kubenswrapper[5008]: I0318 19:32:00.153243 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564372-p5twb"] Mar 18 19:32:00 crc kubenswrapper[5008]: I0318 19:32:00.155086 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564372-p5twb" Mar 18 19:32:00 crc kubenswrapper[5008]: I0318 19:32:00.158011 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:32:00 crc kubenswrapper[5008]: I0318 19:32:00.158342 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:32:00 crc kubenswrapper[5008]: I0318 19:32:00.158526 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:32:00 crc kubenswrapper[5008]: I0318 19:32:00.163549 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564372-p5twb"] Mar 18 19:32:00 crc kubenswrapper[5008]: I0318 19:32:00.240591 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5f5j\" (UniqueName: \"kubernetes.io/projected/d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b-kube-api-access-h5f5j\") pod \"auto-csr-approver-29564372-p5twb\" (UID: 
\"d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b\") " pod="openshift-infra/auto-csr-approver-29564372-p5twb" Mar 18 19:32:00 crc kubenswrapper[5008]: I0318 19:32:00.343595 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5f5j\" (UniqueName: \"kubernetes.io/projected/d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b-kube-api-access-h5f5j\") pod \"auto-csr-approver-29564372-p5twb\" (UID: \"d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b\") " pod="openshift-infra/auto-csr-approver-29564372-p5twb" Mar 18 19:32:00 crc kubenswrapper[5008]: I0318 19:32:00.365145 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5f5j\" (UniqueName: \"kubernetes.io/projected/d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b-kube-api-access-h5f5j\") pod \"auto-csr-approver-29564372-p5twb\" (UID: \"d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b\") " pod="openshift-infra/auto-csr-approver-29564372-p5twb" Mar 18 19:32:00 crc kubenswrapper[5008]: I0318 19:32:00.854063 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564372-p5twb" Mar 18 19:32:01 crc kubenswrapper[5008]: I0318 19:32:01.981362 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564372-p5twb"] Mar 18 19:32:01 crc kubenswrapper[5008]: W0318 19:32:01.996858 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7edaa8c_5eb4_4df7_b896_5e9cfbabe91b.slice/crio-63c0c0ae399fdf5395fa3f3d5cefab7f57d5b73f7f111263119c66e3e0e092b0 WatchSource:0}: Error finding container 63c0c0ae399fdf5395fa3f3d5cefab7f57d5b73f7f111263119c66e3e0e092b0: Status 404 returned error can't find the container with id 63c0c0ae399fdf5395fa3f3d5cefab7f57d5b73f7f111263119c66e3e0e092b0 Mar 18 19:32:02 crc kubenswrapper[5008]: I0318 19:32:02.607315 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564372-p5twb" event={"ID":"d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b","Type":"ContainerStarted","Data":"63c0c0ae399fdf5395fa3f3d5cefab7f57d5b73f7f111263119c66e3e0e092b0"} Mar 18 19:32:03 crc kubenswrapper[5008]: I0318 19:32:03.616065 5008 generic.go:334] "Generic (PLEG): container finished" podID="03f8e795-1b0a-4c14-8b7e-b389a6766cff" containerID="b9f1a321ec3bfc10d5c6a2d5aaa5c93e4aee395ee3cf5c0d21d0b94b9099284b" exitCode=0 Mar 18 19:32:03 crc kubenswrapper[5008]: I0318 19:32:03.616197 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" event={"ID":"03f8e795-1b0a-4c14-8b7e-b389a6766cff","Type":"ContainerDied","Data":"b9f1a321ec3bfc10d5c6a2d5aaa5c93e4aee395ee3cf5c0d21d0b94b9099284b"} Mar 18 19:32:04 crc kubenswrapper[5008]: I0318 19:32:04.624509 5008 generic.go:334] "Generic (PLEG): container finished" podID="d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b" containerID="8dfb5836be42de2aca024ad61a0813b002571a5405ccb7b267da3aa6e7dbc1ee" exitCode=0 Mar 18 19:32:04 crc kubenswrapper[5008]: I0318 
19:32:04.624592 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564372-p5twb" event={"ID":"d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b","Type":"ContainerDied","Data":"8dfb5836be42de2aca024ad61a0813b002571a5405ccb7b267da3aa6e7dbc1ee"} Mar 18 19:32:04 crc kubenswrapper[5008]: I0318 19:32:04.703808 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" Mar 18 19:32:04 crc kubenswrapper[5008]: I0318 19:32:04.730679 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vnkpt/crc-debug-hvl6v"] Mar 18 19:32:04 crc kubenswrapper[5008]: I0318 19:32:04.739353 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vnkpt/crc-debug-hvl6v"] Mar 18 19:32:04 crc kubenswrapper[5008]: I0318 19:32:04.812665 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03f8e795-1b0a-4c14-8b7e-b389a6766cff-host\") pod \"03f8e795-1b0a-4c14-8b7e-b389a6766cff\" (UID: \"03f8e795-1b0a-4c14-8b7e-b389a6766cff\") " Mar 18 19:32:04 crc kubenswrapper[5008]: I0318 19:32:04.812764 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jstfr\" (UniqueName: \"kubernetes.io/projected/03f8e795-1b0a-4c14-8b7e-b389a6766cff-kube-api-access-jstfr\") pod \"03f8e795-1b0a-4c14-8b7e-b389a6766cff\" (UID: \"03f8e795-1b0a-4c14-8b7e-b389a6766cff\") " Mar 18 19:32:04 crc kubenswrapper[5008]: I0318 19:32:04.812769 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03f8e795-1b0a-4c14-8b7e-b389a6766cff-host" (OuterVolumeSpecName: "host") pod "03f8e795-1b0a-4c14-8b7e-b389a6766cff" (UID: "03f8e795-1b0a-4c14-8b7e-b389a6766cff"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 19:32:04 crc kubenswrapper[5008]: I0318 19:32:04.813079 5008 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03f8e795-1b0a-4c14-8b7e-b389a6766cff-host\") on node \"crc\" DevicePath \"\"" Mar 18 19:32:04 crc kubenswrapper[5008]: I0318 19:32:04.826842 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f8e795-1b0a-4c14-8b7e-b389a6766cff-kube-api-access-jstfr" (OuterVolumeSpecName: "kube-api-access-jstfr") pod "03f8e795-1b0a-4c14-8b7e-b389a6766cff" (UID: "03f8e795-1b0a-4c14-8b7e-b389a6766cff"). InnerVolumeSpecName "kube-api-access-jstfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:32:04 crc kubenswrapper[5008]: I0318 19:32:04.914837 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jstfr\" (UniqueName: \"kubernetes.io/projected/03f8e795-1b0a-4c14-8b7e-b389a6766cff-kube-api-access-jstfr\") on node \"crc\" DevicePath \"\"" Mar 18 19:32:05 crc kubenswrapper[5008]: I0318 19:32:05.641737 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnkpt/crc-debug-hvl6v" Mar 18 19:32:05 crc kubenswrapper[5008]: I0318 19:32:05.642687 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8326ae4fbf049c94feec1f50c172b3ebf164d8f7c8b4d72d3606119d29dad4f7" Mar 18 19:32:05 crc kubenswrapper[5008]: I0318 19:32:05.964909 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vnkpt/crc-debug-t9wf6"] Mar 18 19:32:05 crc kubenswrapper[5008]: E0318 19:32:05.965650 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f8e795-1b0a-4c14-8b7e-b389a6766cff" containerName="container-00" Mar 18 19:32:05 crc kubenswrapper[5008]: I0318 19:32:05.965689 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f8e795-1b0a-4c14-8b7e-b389a6766cff" containerName="container-00" Mar 18 19:32:05 crc kubenswrapper[5008]: I0318 19:32:05.965886 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f8e795-1b0a-4c14-8b7e-b389a6766cff" containerName="container-00" Mar 18 19:32:05 crc kubenswrapper[5008]: I0318 19:32:05.966598 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vnkpt/crc-debug-t9wf6" Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.042681 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564372-p5twb" Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.069712 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnx8z\" (UniqueName: \"kubernetes.io/projected/589022c5-86f2-4a77-8f2b-65bec7c5aa76-kube-api-access-tnx8z\") pod \"crc-debug-t9wf6\" (UID: \"589022c5-86f2-4a77-8f2b-65bec7c5aa76\") " pod="openshift-must-gather-vnkpt/crc-debug-t9wf6" Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.069822 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/589022c5-86f2-4a77-8f2b-65bec7c5aa76-host\") pod \"crc-debug-t9wf6\" (UID: \"589022c5-86f2-4a77-8f2b-65bec7c5aa76\") " pod="openshift-must-gather-vnkpt/crc-debug-t9wf6" Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.171205 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5f5j\" (UniqueName: \"kubernetes.io/projected/d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b-kube-api-access-h5f5j\") pod \"d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b\" (UID: \"d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b\") " Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.171637 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnx8z\" (UniqueName: \"kubernetes.io/projected/589022c5-86f2-4a77-8f2b-65bec7c5aa76-kube-api-access-tnx8z\") pod \"crc-debug-t9wf6\" (UID: \"589022c5-86f2-4a77-8f2b-65bec7c5aa76\") " pod="openshift-must-gather-vnkpt/crc-debug-t9wf6" Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.171711 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/589022c5-86f2-4a77-8f2b-65bec7c5aa76-host\") pod \"crc-debug-t9wf6\" (UID: \"589022c5-86f2-4a77-8f2b-65bec7c5aa76\") " pod="openshift-must-gather-vnkpt/crc-debug-t9wf6" 
Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.174835 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/589022c5-86f2-4a77-8f2b-65bec7c5aa76-host\") pod \"crc-debug-t9wf6\" (UID: \"589022c5-86f2-4a77-8f2b-65bec7c5aa76\") " pod="openshift-must-gather-vnkpt/crc-debug-t9wf6" Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.189979 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b-kube-api-access-h5f5j" (OuterVolumeSpecName: "kube-api-access-h5f5j") pod "d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b" (UID: "d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b"). InnerVolumeSpecName "kube-api-access-h5f5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.192955 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnx8z\" (UniqueName: \"kubernetes.io/projected/589022c5-86f2-4a77-8f2b-65bec7c5aa76-kube-api-access-tnx8z\") pod \"crc-debug-t9wf6\" (UID: \"589022c5-86f2-4a77-8f2b-65bec7c5aa76\") " pod="openshift-must-gather-vnkpt/crc-debug-t9wf6" Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.207049 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f8e795-1b0a-4c14-8b7e-b389a6766cff" path="/var/lib/kubelet/pods/03f8e795-1b0a-4c14-8b7e-b389a6766cff/volumes" Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.279774 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5f5j\" (UniqueName: \"kubernetes.io/projected/d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b-kube-api-access-h5f5j\") on node \"crc\" DevicePath \"\"" Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.340257 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnkpt/crc-debug-t9wf6" Mar 18 19:32:06 crc kubenswrapper[5008]: W0318 19:32:06.369820 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod589022c5_86f2_4a77_8f2b_65bec7c5aa76.slice/crio-b15447132bb930a4e8805ec53373194c384cd383d32f2742ce776679d5db3aaf WatchSource:0}: Error finding container b15447132bb930a4e8805ec53373194c384cd383d32f2742ce776679d5db3aaf: Status 404 returned error can't find the container with id b15447132bb930a4e8805ec53373194c384cd383d32f2742ce776679d5db3aaf Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.653173 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564372-p5twb" event={"ID":"d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b","Type":"ContainerDied","Data":"63c0c0ae399fdf5395fa3f3d5cefab7f57d5b73f7f111263119c66e3e0e092b0"} Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.653524 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63c0c0ae399fdf5395fa3f3d5cefab7f57d5b73f7f111263119c66e3e0e092b0" Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.653203 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564372-p5twb" Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.655040 5008 generic.go:334] "Generic (PLEG): container finished" podID="589022c5-86f2-4a77-8f2b-65bec7c5aa76" containerID="c76b2909698c56d1619f3f7902b3b5b301d0abf9d91ad002fcd0f3caeff55fd0" exitCode=1 Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.655097 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnkpt/crc-debug-t9wf6" event={"ID":"589022c5-86f2-4a77-8f2b-65bec7c5aa76","Type":"ContainerDied","Data":"c76b2909698c56d1619f3f7902b3b5b301d0abf9d91ad002fcd0f3caeff55fd0"} Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.655144 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnkpt/crc-debug-t9wf6" event={"ID":"589022c5-86f2-4a77-8f2b-65bec7c5aa76","Type":"ContainerStarted","Data":"b15447132bb930a4e8805ec53373194c384cd383d32f2742ce776679d5db3aaf"} Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.696001 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vnkpt/crc-debug-t9wf6"] Mar 18 19:32:06 crc kubenswrapper[5008]: I0318 19:32:06.701875 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vnkpt/crc-debug-t9wf6"] Mar 18 19:32:07 crc kubenswrapper[5008]: I0318 19:32:07.115698 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564366-jn9mc"] Mar 18 19:32:07 crc kubenswrapper[5008]: I0318 19:32:07.121272 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564366-jn9mc"] Mar 18 19:32:07 crc kubenswrapper[5008]: I0318 19:32:07.738804 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnkpt/crc-debug-t9wf6" Mar 18 19:32:07 crc kubenswrapper[5008]: I0318 19:32:07.808349 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/589022c5-86f2-4a77-8f2b-65bec7c5aa76-host\") pod \"589022c5-86f2-4a77-8f2b-65bec7c5aa76\" (UID: \"589022c5-86f2-4a77-8f2b-65bec7c5aa76\") " Mar 18 19:32:07 crc kubenswrapper[5008]: I0318 19:32:07.808973 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/589022c5-86f2-4a77-8f2b-65bec7c5aa76-host" (OuterVolumeSpecName: "host") pod "589022c5-86f2-4a77-8f2b-65bec7c5aa76" (UID: "589022c5-86f2-4a77-8f2b-65bec7c5aa76"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 19:32:07 crc kubenswrapper[5008]: I0318 19:32:07.909413 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnx8z\" (UniqueName: \"kubernetes.io/projected/589022c5-86f2-4a77-8f2b-65bec7c5aa76-kube-api-access-tnx8z\") pod \"589022c5-86f2-4a77-8f2b-65bec7c5aa76\" (UID: \"589022c5-86f2-4a77-8f2b-65bec7c5aa76\") " Mar 18 19:32:07 crc kubenswrapper[5008]: I0318 19:32:07.909791 5008 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/589022c5-86f2-4a77-8f2b-65bec7c5aa76-host\") on node \"crc\" DevicePath \"\"" Mar 18 19:32:07 crc kubenswrapper[5008]: I0318 19:32:07.928798 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/589022c5-86f2-4a77-8f2b-65bec7c5aa76-kube-api-access-tnx8z" (OuterVolumeSpecName: "kube-api-access-tnx8z") pod "589022c5-86f2-4a77-8f2b-65bec7c5aa76" (UID: "589022c5-86f2-4a77-8f2b-65bec7c5aa76"). InnerVolumeSpecName "kube-api-access-tnx8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:32:08 crc kubenswrapper[5008]: I0318 19:32:08.011018 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnx8z\" (UniqueName: \"kubernetes.io/projected/589022c5-86f2-4a77-8f2b-65bec7c5aa76-kube-api-access-tnx8z\") on node \"crc\" DevicePath \"\"" Mar 18 19:32:08 crc kubenswrapper[5008]: I0318 19:32:08.207532 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be5f5c0-c2ef-41d2-91ea-ebe8631b2566" path="/var/lib/kubelet/pods/1be5f5c0-c2ef-41d2-91ea-ebe8631b2566/volumes" Mar 18 19:32:08 crc kubenswrapper[5008]: I0318 19:32:08.208229 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="589022c5-86f2-4a77-8f2b-65bec7c5aa76" path="/var/lib/kubelet/pods/589022c5-86f2-4a77-8f2b-65bec7c5aa76/volumes" Mar 18 19:32:08 crc kubenswrapper[5008]: I0318 19:32:08.673449 5008 scope.go:117] "RemoveContainer" containerID="c76b2909698c56d1619f3f7902b3b5b301d0abf9d91ad002fcd0f3caeff55fd0" Mar 18 19:32:08 crc kubenswrapper[5008]: I0318 19:32:08.673504 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnkpt/crc-debug-t9wf6" Mar 18 19:32:17 crc kubenswrapper[5008]: I0318 19:32:17.416175 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79d747d849-gfrl7_2b0f0d45-3f1b-478e-9f72-767efaeb3b10/init/0.log" Mar 18 19:32:17 crc kubenswrapper[5008]: I0318 19:32:17.674163 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79d747d849-gfrl7_2b0f0d45-3f1b-478e-9f72-767efaeb3b10/init/0.log" Mar 18 19:32:17 crc kubenswrapper[5008]: I0318 19:32:17.704366 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79d747d849-gfrl7_2b0f0d45-3f1b-478e-9f72-767efaeb3b10/dnsmasq-dns/0.log" Mar 18 19:32:17 crc kubenswrapper[5008]: I0318 19:32:17.863499 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_5fa7fa02-ca96-44a4-b42b-9509fe7d6f14/adoption/0.log" Mar 18 19:32:17 crc kubenswrapper[5008]: I0318 19:32:17.916541 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ef72875b-b932-47d0-8bf9-ba4a63d47fb3/memcached/0.log" Mar 18 19:32:18 crc kubenswrapper[5008]: I0318 19:32:18.017964 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2eb4ee4c-a64a-4d36-9b6d-a3386cb30917/mysql-bootstrap/0.log" Mar 18 19:32:18 crc kubenswrapper[5008]: I0318 19:32:18.190720 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2eb4ee4c-a64a-4d36-9b6d-a3386cb30917/galera/0.log" Mar 18 19:32:18 crc kubenswrapper[5008]: I0318 19:32:18.194458 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2eb4ee4c-a64a-4d36-9b6d-a3386cb30917/mysql-bootstrap/0.log" Mar 18 19:32:18 crc kubenswrapper[5008]: I0318 19:32:18.246569 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_20b29007-4fe1-4277-adf3-5ad1fe710130/mysql-bootstrap/0.log" Mar 18 19:32:18 crc kubenswrapper[5008]: I0318 19:32:18.462231 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_20b29007-4fe1-4277-adf3-5ad1fe710130/galera/0.log" Mar 18 19:32:18 crc kubenswrapper[5008]: I0318 19:32:18.464660 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_20b29007-4fe1-4277-adf3-5ad1fe710130/mysql-bootstrap/0.log" Mar 18 19:32:18 crc kubenswrapper[5008]: I0318 19:32:18.500644 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_bcd4e301-5199-4e63-b07e-4949232a96d2/adoption/0.log" Mar 18 19:32:18 crc kubenswrapper[5008]: I0318 19:32:18.619861 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0ce66255-5329-4e59-a9e9-aee0f4b845e7/openstack-network-exporter/0.log" Mar 18 19:32:18 crc kubenswrapper[5008]: I0318 19:32:18.685916 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0ce66255-5329-4e59-a9e9-aee0f4b845e7/ovn-northd/0.log" Mar 18 19:32:18 crc kubenswrapper[5008]: I0318 19:32:18.774329 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a1fa7681-037d-42ad-bded-ab849fd5541b/openstack-network-exporter/0.log" Mar 18 19:32:18 crc kubenswrapper[5008]: I0318 19:32:18.849664 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a1fa7681-037d-42ad-bded-ab849fd5541b/ovsdbserver-nb/0.log" Mar 18 19:32:18 crc kubenswrapper[5008]: I0318 19:32:18.912419 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_3796e669-b541-4e2c-b876-006a019c5d9a/openstack-network-exporter/0.log" Mar 18 19:32:18 crc kubenswrapper[5008]: I0318 19:32:18.955213 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-1_3796e669-b541-4e2c-b876-006a019c5d9a/ovsdbserver-nb/0.log" Mar 18 19:32:19 crc kubenswrapper[5008]: I0318 19:32:19.045469 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_3e8645c7-3653-4517-b1c9-2be63877356f/openstack-network-exporter/0.log" Mar 18 19:32:19 crc kubenswrapper[5008]: I0318 19:32:19.109775 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_3e8645c7-3653-4517-b1c9-2be63877356f/ovsdbserver-nb/0.log" Mar 18 19:32:19 crc kubenswrapper[5008]: I0318 19:32:19.248033 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0aa59857-10dc-408d-9f91-3ed06b022f0c/openstack-network-exporter/0.log" Mar 18 19:32:19 crc kubenswrapper[5008]: I0318 19:32:19.282046 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0aa59857-10dc-408d-9f91-3ed06b022f0c/ovsdbserver-sb/0.log" Mar 18 19:32:19 crc kubenswrapper[5008]: I0318 19:32:19.359676 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_28bb0d68-09a7-4c21-a62f-b7f0687e22c4/openstack-network-exporter/0.log" Mar 18 19:32:19 crc kubenswrapper[5008]: I0318 19:32:19.412476 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_28bb0d68-09a7-4c21-a62f-b7f0687e22c4/ovsdbserver-sb/0.log" Mar 18 19:32:19 crc kubenswrapper[5008]: I0318 19:32:19.489260 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_1028aa1c-2139-49f0-8ed6-4187186bc1c9/openstack-network-exporter/0.log" Mar 18 19:32:19 crc kubenswrapper[5008]: I0318 19:32:19.591240 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_1028aa1c-2139-49f0-8ed6-4187186bc1c9/ovsdbserver-sb/0.log" Mar 18 19:32:19 crc kubenswrapper[5008]: I0318 19:32:19.651424 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4b353952-1871-46aa-b76a-a475bbc9fb42/setup-container/0.log" Mar 18 19:32:19 crc kubenswrapper[5008]: I0318 19:32:19.830913 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4b353952-1871-46aa-b76a-a475bbc9fb42/rabbitmq/0.log" Mar 18 19:32:19 crc kubenswrapper[5008]: I0318 19:32:19.837538 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4b353952-1871-46aa-b76a-a475bbc9fb42/setup-container/0.log" Mar 18 19:32:19 crc kubenswrapper[5008]: I0318 19:32:19.867075 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bf16152f-7363-4a9a-906b-237917d3e262/setup-container/0.log" Mar 18 19:32:20 crc kubenswrapper[5008]: I0318 19:32:20.045739 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bf16152f-7363-4a9a-906b-237917d3e262/setup-container/0.log" Mar 18 19:32:20 crc kubenswrapper[5008]: I0318 19:32:20.053235 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_root-account-create-update-q986z_35246fdb-28bc-4e43-af93-b4171a374147/mariadb-account-create-update/0.log" Mar 18 19:32:20 crc kubenswrapper[5008]: I0318 19:32:20.064261 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bf16152f-7363-4a9a-906b-237917d3e262/rabbitmq/0.log" Mar 18 19:32:35 crc kubenswrapper[5008]: I0318 19:32:35.233285 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z_b12d4842-a3e7-43d9-b1d2-29e90b8aa247/util/0.log" Mar 18 19:32:35 crc kubenswrapper[5008]: I0318 19:32:35.369469 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z_b12d4842-a3e7-43d9-b1d2-29e90b8aa247/util/0.log" Mar 18 19:32:35 crc kubenswrapper[5008]: I0318 
19:32:35.393127 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z_b12d4842-a3e7-43d9-b1d2-29e90b8aa247/pull/0.log" Mar 18 19:32:35 crc kubenswrapper[5008]: I0318 19:32:35.431427 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z_b12d4842-a3e7-43d9-b1d2-29e90b8aa247/pull/0.log" Mar 18 19:32:35 crc kubenswrapper[5008]: I0318 19:32:35.607096 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z_b12d4842-a3e7-43d9-b1d2-29e90b8aa247/util/0.log" Mar 18 19:32:35 crc kubenswrapper[5008]: I0318 19:32:35.616487 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z_b12d4842-a3e7-43d9-b1d2-29e90b8aa247/extract/0.log" Mar 18 19:32:35 crc kubenswrapper[5008]: I0318 19:32:35.634631 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ccw64z_b12d4842-a3e7-43d9-b1d2-29e90b8aa247/pull/0.log" Mar 18 19:32:35 crc kubenswrapper[5008]: I0318 19:32:35.864671 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-cgg7x_852f1249-96d5-4768-b4a5-cba6a81a00a0/manager/0.log" Mar 18 19:32:36 crc kubenswrapper[5008]: I0318 19:32:36.298394 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-wn4dr_d0a49d36-fd45-4ff2-9bb9-f1ccfb048537/manager/0.log" Mar 18 19:32:36 crc kubenswrapper[5008]: I0318 19:32:36.422048 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-thfzw_0c6afee9-7e01-426d-beb5-7db66667228e/manager/0.log" Mar 18 19:32:36 crc kubenswrapper[5008]: I0318 19:32:36.483214 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-x9hn5_3c744c82-fc71-4e82-8d8c-4f43404a7664/manager/0.log" Mar 18 19:32:36 crc kubenswrapper[5008]: I0318 19:32:36.594823 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-stnqj_185ad843-7fab-4ae9-9e83-91c681a93f90/manager/0.log" Mar 18 19:32:36 crc kubenswrapper[5008]: I0318 19:32:36.904859 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-4ctxd_53e77882-f2cb-4be8-a8ca-4e9118f30a95/manager/0.log" Mar 18 19:32:37 crc kubenswrapper[5008]: I0318 19:32:37.152081 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-k9grx_b78718c8-6ad4-4bc1-ae6d-26bcfbddf493/manager/0.log" Mar 18 19:32:37 crc kubenswrapper[5008]: I0318 19:32:37.210951 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-7fvgk_7cb2d5ac-6334-4e36-9c90-8554ccd85c4f/manager/0.log" Mar 18 19:32:37 crc kubenswrapper[5008]: I0318 19:32:37.349931 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-lcvlv_17f04529-6a55-45ae-a999-888e15d5e9d8/manager/0.log" Mar 18 19:32:37 crc kubenswrapper[5008]: I0318 19:32:37.521193 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-thmph_fdaac0ac-492e-438a-b737-55c88fcf77f1/manager/0.log" Mar 18 19:32:37 crc kubenswrapper[5008]: I0318 19:32:37.532219 5008 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-fbwjk_c60b7b7f-31b3-49da-b7a3-9559345c180b/manager/0.log" Mar 18 19:32:37 crc kubenswrapper[5008]: I0318 19:32:37.699093 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-6pf8x_771f6d56-9294-45d0-bf92-5b59be4313bf/manager/0.log" Mar 18 19:32:37 crc kubenswrapper[5008]: I0318 19:32:37.800057 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-9w9g2_91590da1-e8d8-47d0-8737-83b39da9214f/manager/0.log" Mar 18 19:32:37 crc kubenswrapper[5008]: I0318 19:32:37.890087 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-k8vzn_ccf8ba7b-3e8e-4b28-9705-26a812edfc07/manager/0.log" Mar 18 19:32:37 crc kubenswrapper[5008]: I0318 19:32:37.956096 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-74c4796899mb8mj_d5e131ed-11af-4026-a4d2-7a25e42e38c9/manager/0.log" Mar 18 19:32:38 crc kubenswrapper[5008]: I0318 19:32:38.193498 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-b85c4d696-mrt6s_f80b85c8-6f72-4dc2-bbfa-b03c0371597a/operator/0.log" Mar 18 19:32:38 crc kubenswrapper[5008]: I0318 19:32:38.444322 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hrzg6_45bb4ac1-daae-4412-8c26-38429a1f1182/registry-server/0.log" Mar 18 19:32:38 crc kubenswrapper[5008]: I0318 19:32:38.508076 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-x5ghn_847f8b5e-2233-4fb5-964b-2df8309401d6/manager/0.log" Mar 18 19:32:38 crc kubenswrapper[5008]: I0318 19:32:38.632421 5008 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-hg8kc_5c650888-8d3a-4835-bdc0-2686f8881f62/manager/0.log" Mar 18 19:32:38 crc kubenswrapper[5008]: I0318 19:32:38.837093 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-nlftd_280f2ba7-134f-472b-82f4-d3728bbe6d31/operator/0.log" Mar 18 19:32:38 crc kubenswrapper[5008]: I0318 19:32:38.940183 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-rjdmt_3b9302e6-1019-49f3-a708-d8552045764e/manager/0.log" Mar 18 19:32:38 crc kubenswrapper[5008]: I0318 19:32:38.996375 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-86bd8996f6-twfsv_002df576-b6e1-4ffd-9eda-5751dcf89505/manager/0.log" Mar 18 19:32:39 crc kubenswrapper[5008]: I0318 19:32:39.132015 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-g6vj6_da6e231d-9d56-4c1c-a10e-b4e258a88c2a/manager/0.log" Mar 18 19:32:39 crc kubenswrapper[5008]: I0318 19:32:39.182303 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-kqmf8_8d3f1117-f02f-406b-bae8-84e5e71212c1/manager/0.log" Mar 18 19:32:39 crc kubenswrapper[5008]: I0318 19:32:39.239145 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-2v4j6_1048d1cb-c6cf-47dd-8896-8445f09e6d25/manager/0.log" Mar 18 19:32:45 crc kubenswrapper[5008]: I0318 19:32:45.120112 5008 scope.go:117] "RemoveContainer" containerID="4a642dbc62e9f7a8a7cc2d5122326b8d39353cc03ba3972772ecf36c4cb6c0fd" Mar 18 19:32:59 crc kubenswrapper[5008]: I0318 19:32:59.024850 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ptkj5_87181d7a-94d9-4918-99d6-0fa95896bc05/control-plane-machine-set-operator/0.log" Mar 18 19:32:59 crc kubenswrapper[5008]: I0318 19:32:59.212406 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qnsxl_137dc523-385f-4afb-b972-66093e2e071e/kube-rbac-proxy/0.log" Mar 18 19:32:59 crc kubenswrapper[5008]: I0318 19:32:59.245863 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qnsxl_137dc523-385f-4afb-b972-66093e2e071e/machine-api-operator/0.log" Mar 18 19:33:11 crc kubenswrapper[5008]: I0318 19:33:11.082063 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-q986z"] Mar 18 19:33:11 crc kubenswrapper[5008]: I0318 19:33:11.096528 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-q986z"] Mar 18 19:33:12 crc kubenswrapper[5008]: I0318 19:33:12.209728 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35246fdb-28bc-4e43-af93-b4171a374147" path="/var/lib/kubelet/pods/35246fdb-28bc-4e43-af93-b4171a374147/volumes" Mar 18 19:33:12 crc kubenswrapper[5008]: I0318 19:33:12.642101 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-7zrmj_dffaebea-0338-4742-ab2a-801b071ae679/cert-manager-controller/0.log" Mar 18 19:33:12 crc kubenswrapper[5008]: I0318 19:33:12.743960 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-8m92h_96b8dcae-2676-4c66-9902-ea0976801d41/cert-manager-cainjector/0.log" Mar 18 19:33:12 crc kubenswrapper[5008]: I0318 19:33:12.824923 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-2wx9n_edaf5687-b4ea-4a5c-a4f1-7f26a9e2b2eb/cert-manager-webhook/0.log" Mar 18 
19:33:24 crc kubenswrapper[5008]: I0318 19:33:24.460567 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:33:24 crc kubenswrapper[5008]: I0318 19:33:24.461242 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:33:25 crc kubenswrapper[5008]: I0318 19:33:25.908275 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-p95vq_72afd8eb-d07a-4828-ab98-c094097c937d/nmstate-console-plugin/0.log" Mar 18 19:33:26 crc kubenswrapper[5008]: I0318 19:33:26.091919 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-dfm4p_c698aa79-da33-410b-86db-38e9fd3a4806/nmstate-handler/0.log" Mar 18 19:33:26 crc kubenswrapper[5008]: I0318 19:33:26.103061 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-cqxqv_ed341cb5-f441-4f05-951b-973883b19672/kube-rbac-proxy/0.log" Mar 18 19:33:26 crc kubenswrapper[5008]: I0318 19:33:26.190909 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-cqxqv_ed341cb5-f441-4f05-951b-973883b19672/nmstate-metrics/0.log" Mar 18 19:33:26 crc kubenswrapper[5008]: I0318 19:33:26.260301 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-jq4ln_30844b92-1089-4729-9e30-38cb366fbe0f/nmstate-operator/0.log" Mar 18 19:33:26 crc kubenswrapper[5008]: I0318 19:33:26.395317 5008 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-vnfv5_eb0f4cc1-7468-434f-bc63-7c3575621186/nmstate-webhook/0.log" Mar 18 19:33:45 crc kubenswrapper[5008]: I0318 19:33:45.202701 5008 scope.go:117] "RemoveContainer" containerID="3a4dc2f4dea0d98b56e5a3ba1287d14dd64b851dcbc4f5cf708b04cf12ff3c4c" Mar 18 19:33:54 crc kubenswrapper[5008]: I0318 19:33:54.459887 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:33:54 crc kubenswrapper[5008]: I0318 19:33:54.460375 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:33:55 crc kubenswrapper[5008]: I0318 19:33:55.569992 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-q5drs_652e9f56-c4d2-493b-bc68-11fd5cff1657/kube-rbac-proxy/0.log" Mar 18 19:33:55 crc kubenswrapper[5008]: I0318 19:33:55.803782 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/cp-frr-files/0.log" Mar 18 19:33:55 crc kubenswrapper[5008]: I0318 19:33:55.932464 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-q5drs_652e9f56-c4d2-493b-bc68-11fd5cff1657/controller/0.log" Mar 18 19:33:55 crc kubenswrapper[5008]: I0318 19:33:55.999738 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/cp-frr-files/0.log" Mar 18 19:33:56 crc 
kubenswrapper[5008]: I0318 19:33:56.025935 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/cp-reloader/0.log" Mar 18 19:33:56 crc kubenswrapper[5008]: I0318 19:33:56.042979 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/cp-metrics/0.log" Mar 18 19:33:56 crc kubenswrapper[5008]: I0318 19:33:56.107487 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/cp-reloader/0.log" Mar 18 19:33:56 crc kubenswrapper[5008]: I0318 19:33:56.289573 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/cp-frr-files/0.log" Mar 18 19:33:56 crc kubenswrapper[5008]: I0318 19:33:56.325738 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/cp-reloader/0.log" Mar 18 19:33:56 crc kubenswrapper[5008]: I0318 19:33:56.348644 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/cp-metrics/0.log" Mar 18 19:33:56 crc kubenswrapper[5008]: I0318 19:33:56.477811 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/cp-metrics/0.log" Mar 18 19:33:56 crc kubenswrapper[5008]: I0318 19:33:56.641836 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/cp-frr-files/0.log" Mar 18 19:33:56 crc kubenswrapper[5008]: I0318 19:33:56.689135 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/cp-metrics/0.log" Mar 18 19:33:56 crc kubenswrapper[5008]: I0318 19:33:56.696541 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/cp-reloader/0.log" Mar 18 19:33:56 crc kubenswrapper[5008]: I0318 19:33:56.720939 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/controller/0.log" Mar 18 19:33:56 crc kubenswrapper[5008]: I0318 19:33:56.913573 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/kube-rbac-proxy/0.log" Mar 18 19:33:56 crc kubenswrapper[5008]: I0318 19:33:56.976363 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/kube-rbac-proxy-frr/0.log" Mar 18 19:33:56 crc kubenswrapper[5008]: I0318 19:33:56.987637 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/frr-metrics/0.log" Mar 18 19:33:57 crc kubenswrapper[5008]: I0318 19:33:57.129457 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/reloader/0.log" Mar 18 19:33:57 crc kubenswrapper[5008]: I0318 19:33:57.256747 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-5ctcq_b7ac363f-dcd1-43df-a280-45ded08e9446/frr-k8s-webhook-server/0.log" Mar 18 19:33:57 crc kubenswrapper[5008]: I0318 19:33:57.458540 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84bb8d8d7b-wj6kv_fd83a3e9-013e-469d-9f55-fcc1197ff19c/manager/0.log" Mar 18 19:33:57 crc kubenswrapper[5008]: I0318 19:33:57.618511 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-66dd9db899-r984g_8257155e-9c34-4066-99b6-9c112804ee23/webhook-server/0.log" Mar 18 19:33:57 crc kubenswrapper[5008]: I0318 19:33:57.817830 5008 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pdjkn_44eda89c-4b47-46f6-a60f-a439d8721b2f/kube-rbac-proxy/0.log" Mar 18 19:33:58 crc kubenswrapper[5008]: I0318 19:33:58.299003 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pdjkn_44eda89c-4b47-46f6-a60f-a439d8721b2f/speaker/0.log" Mar 18 19:33:58 crc kubenswrapper[5008]: I0318 19:33:58.659351 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnvl6_32700067-407c-41a0-8d49-835fd75bb28d/frr/0.log" Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.169073 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564374-djzhs"] Mar 18 19:34:00 crc kubenswrapper[5008]: E0318 19:34:00.169963 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589022c5-86f2-4a77-8f2b-65bec7c5aa76" containerName="container-00" Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.169980 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="589022c5-86f2-4a77-8f2b-65bec7c5aa76" containerName="container-00" Mar 18 19:34:00 crc kubenswrapper[5008]: E0318 19:34:00.170002 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b" containerName="oc" Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.170009 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b" containerName="oc" Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.170208 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="589022c5-86f2-4a77-8f2b-65bec7c5aa76" containerName="container-00" Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.170225 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b" containerName="oc" Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.170946 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564374-djzhs" Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.173329 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.174224 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.174278 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.210743 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564374-djzhs"] Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.350360 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fpdk\" (UniqueName: \"kubernetes.io/projected/70f82731-0d3b-4ad6-971b-d6dc37970d4c-kube-api-access-8fpdk\") pod \"auto-csr-approver-29564374-djzhs\" (UID: \"70f82731-0d3b-4ad6-971b-d6dc37970d4c\") " pod="openshift-infra/auto-csr-approver-29564374-djzhs" Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.452316 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fpdk\" (UniqueName: \"kubernetes.io/projected/70f82731-0d3b-4ad6-971b-d6dc37970d4c-kube-api-access-8fpdk\") pod \"auto-csr-approver-29564374-djzhs\" (UID: \"70f82731-0d3b-4ad6-971b-d6dc37970d4c\") " pod="openshift-infra/auto-csr-approver-29564374-djzhs" Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.485768 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fpdk\" (UniqueName: \"kubernetes.io/projected/70f82731-0d3b-4ad6-971b-d6dc37970d4c-kube-api-access-8fpdk\") pod \"auto-csr-approver-29564374-djzhs\" (UID: \"70f82731-0d3b-4ad6-971b-d6dc37970d4c\") " 
pod="openshift-infra/auto-csr-approver-29564374-djzhs" Mar 18 19:34:00 crc kubenswrapper[5008]: I0318 19:34:00.494717 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564374-djzhs" Mar 18 19:34:01 crc kubenswrapper[5008]: I0318 19:34:01.099891 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564374-djzhs"] Mar 18 19:34:01 crc kubenswrapper[5008]: I0318 19:34:01.609003 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564374-djzhs" event={"ID":"70f82731-0d3b-4ad6-971b-d6dc37970d4c","Type":"ContainerStarted","Data":"8c4fe7026ab656f02f5c51c1135a1430c313907eef7c94e7977ebb5efff1cfc2"} Mar 18 19:34:02 crc kubenswrapper[5008]: I0318 19:34:02.618739 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564374-djzhs" event={"ID":"70f82731-0d3b-4ad6-971b-d6dc37970d4c","Type":"ContainerStarted","Data":"281ab35a0e5a4d169851dcbf9b2fd72841ae89315e28724b9932fd03194d0581"} Mar 18 19:34:02 crc kubenswrapper[5008]: I0318 19:34:02.635388 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564374-djzhs" podStartSLOduration=1.6727767839999999 podStartE2EDuration="2.635374267s" podCreationTimestamp="2026-03-18 19:34:00 +0000 UTC" firstStartedPulling="2026-03-18 19:34:01.123793059 +0000 UTC m=+5497.643266148" lastFinishedPulling="2026-03-18 19:34:02.086390522 +0000 UTC m=+5498.605863631" observedRunningTime="2026-03-18 19:34:02.631409903 +0000 UTC m=+5499.150882982" watchObservedRunningTime="2026-03-18 19:34:02.635374267 +0000 UTC m=+5499.154847346" Mar 18 19:34:03 crc kubenswrapper[5008]: I0318 19:34:03.627877 5008 generic.go:334] "Generic (PLEG): container finished" podID="70f82731-0d3b-4ad6-971b-d6dc37970d4c" containerID="281ab35a0e5a4d169851dcbf9b2fd72841ae89315e28724b9932fd03194d0581" exitCode=0 Mar 18 19:34:03 crc 
kubenswrapper[5008]: I0318 19:34:03.627919 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564374-djzhs" event={"ID":"70f82731-0d3b-4ad6-971b-d6dc37970d4c","Type":"ContainerDied","Data":"281ab35a0e5a4d169851dcbf9b2fd72841ae89315e28724b9932fd03194d0581"} Mar 18 19:34:04 crc kubenswrapper[5008]: I0318 19:34:04.997662 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564374-djzhs" Mar 18 19:34:05 crc kubenswrapper[5008]: I0318 19:34:05.144662 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fpdk\" (UniqueName: \"kubernetes.io/projected/70f82731-0d3b-4ad6-971b-d6dc37970d4c-kube-api-access-8fpdk\") pod \"70f82731-0d3b-4ad6-971b-d6dc37970d4c\" (UID: \"70f82731-0d3b-4ad6-971b-d6dc37970d4c\") " Mar 18 19:34:05 crc kubenswrapper[5008]: I0318 19:34:05.149885 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f82731-0d3b-4ad6-971b-d6dc37970d4c-kube-api-access-8fpdk" (OuterVolumeSpecName: "kube-api-access-8fpdk") pod "70f82731-0d3b-4ad6-971b-d6dc37970d4c" (UID: "70f82731-0d3b-4ad6-971b-d6dc37970d4c"). InnerVolumeSpecName "kube-api-access-8fpdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:34:05 crc kubenswrapper[5008]: I0318 19:34:05.246608 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fpdk\" (UniqueName: \"kubernetes.io/projected/70f82731-0d3b-4ad6-971b-d6dc37970d4c-kube-api-access-8fpdk\") on node \"crc\" DevicePath \"\"" Mar 18 19:34:05 crc kubenswrapper[5008]: I0318 19:34:05.650735 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564374-djzhs" event={"ID":"70f82731-0d3b-4ad6-971b-d6dc37970d4c","Type":"ContainerDied","Data":"8c4fe7026ab656f02f5c51c1135a1430c313907eef7c94e7977ebb5efff1cfc2"} Mar 18 19:34:05 crc kubenswrapper[5008]: I0318 19:34:05.650828 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c4fe7026ab656f02f5c51c1135a1430c313907eef7c94e7977ebb5efff1cfc2" Mar 18 19:34:05 crc kubenswrapper[5008]: I0318 19:34:05.650952 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564374-djzhs" Mar 18 19:34:05 crc kubenswrapper[5008]: I0318 19:34:05.726673 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564368-pw579"] Mar 18 19:34:05 crc kubenswrapper[5008]: I0318 19:34:05.734538 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564368-pw579"] Mar 18 19:34:06 crc kubenswrapper[5008]: I0318 19:34:06.210352 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c2a2b0-1576-4984-b1f3-97799587fccc" path="/var/lib/kubelet/pods/36c2a2b0-1576-4984-b1f3-97799587fccc/volumes" Mar 18 19:34:13 crc kubenswrapper[5008]: I0318 19:34:13.039784 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q_04389aab-de3f-49c6-ab43-d978cd41f38d/util/0.log" Mar 18 19:34:13 crc kubenswrapper[5008]: I0318 
19:34:13.351434 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q_04389aab-de3f-49c6-ab43-d978cd41f38d/util/0.log" Mar 18 19:34:13 crc kubenswrapper[5008]: I0318 19:34:13.354053 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q_04389aab-de3f-49c6-ab43-d978cd41f38d/pull/0.log" Mar 18 19:34:13 crc kubenswrapper[5008]: I0318 19:34:13.387206 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q_04389aab-de3f-49c6-ab43-d978cd41f38d/pull/0.log" Mar 18 19:34:13 crc kubenswrapper[5008]: I0318 19:34:13.543641 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q_04389aab-de3f-49c6-ab43-d978cd41f38d/util/0.log" Mar 18 19:34:13 crc kubenswrapper[5008]: I0318 19:34:13.549883 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q_04389aab-de3f-49c6-ab43-d978cd41f38d/pull/0.log" Mar 18 19:34:13 crc kubenswrapper[5008]: I0318 19:34:13.554849 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hks9q_04389aab-de3f-49c6-ab43-d978cd41f38d/extract/0.log" Mar 18 19:34:13 crc kubenswrapper[5008]: I0318 19:34:13.701332 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp_a9b926c7-726b-4b59-b7de-0939b357dbc2/util/0.log" Mar 18 19:34:13 crc kubenswrapper[5008]: I0318 19:34:13.853722 5008 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp_a9b926c7-726b-4b59-b7de-0939b357dbc2/util/0.log"
Mar 18 19:34:13 crc kubenswrapper[5008]: I0318 19:34:13.862892 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp_a9b926c7-726b-4b59-b7de-0939b357dbc2/pull/0.log"
Mar 18 19:34:13 crc kubenswrapper[5008]: I0318 19:34:13.893825 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp_a9b926c7-726b-4b59-b7de-0939b357dbc2/pull/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.008402 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp_a9b926c7-726b-4b59-b7de-0939b357dbc2/util/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.015017 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp_a9b926c7-726b-4b59-b7de-0939b357dbc2/extract/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.026701 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1kt7bp_a9b926c7-726b-4b59-b7de-0939b357dbc2/pull/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.191966 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7_0a50008d-38c2-4192-8a3f-8e18d2e0ef41/util/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.366409 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7_0a50008d-38c2-4192-8a3f-8e18d2e0ef41/util/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.406615 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7_0a50008d-38c2-4192-8a3f-8e18d2e0ef41/pull/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.420096 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7_0a50008d-38c2-4192-8a3f-8e18d2e0ef41/pull/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.536852 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7_0a50008d-38c2-4192-8a3f-8e18d2e0ef41/util/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.556977 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7_0a50008d-38c2-4192-8a3f-8e18d2e0ef41/pull/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.585214 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54z8q7_0a50008d-38c2-4192-8a3f-8e18d2e0ef41/extract/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.739129 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhkrf_069bee86-2582-433b-a7e9-59fa22bb650d/extract-utilities/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.923767 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhkrf_069bee86-2582-433b-a7e9-59fa22bb650d/extract-content/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.924030 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhkrf_069bee86-2582-433b-a7e9-59fa22bb650d/extract-utilities/0.log"
Mar 18 19:34:14 crc kubenswrapper[5008]: I0318 19:34:14.939593 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhkrf_069bee86-2582-433b-a7e9-59fa22bb650d/extract-content/0.log"
Mar 18 19:34:15 crc kubenswrapper[5008]: I0318 19:34:15.150048 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhkrf_069bee86-2582-433b-a7e9-59fa22bb650d/extract-content/0.log"
Mar 18 19:34:15 crc kubenswrapper[5008]: I0318 19:34:15.173421 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhkrf_069bee86-2582-433b-a7e9-59fa22bb650d/extract-utilities/0.log"
Mar 18 19:34:15 crc kubenswrapper[5008]: I0318 19:34:15.387490 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qz9gd_be98c2c7-e6a9-4459-9693-6c05374daff8/extract-utilities/0.log"
Mar 18 19:34:15 crc kubenswrapper[5008]: I0318 19:34:15.556587 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qz9gd_be98c2c7-e6a9-4459-9693-6c05374daff8/extract-utilities/0.log"
Mar 18 19:34:15 crc kubenswrapper[5008]: I0318 19:34:15.619528 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qz9gd_be98c2c7-e6a9-4459-9693-6c05374daff8/extract-content/0.log"
Mar 18 19:34:15 crc kubenswrapper[5008]: I0318 19:34:15.655264 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qz9gd_be98c2c7-e6a9-4459-9693-6c05374daff8/extract-content/0.log"
Mar 18 19:34:15 crc kubenswrapper[5008]: I0318 19:34:15.782858 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dhkrf_069bee86-2582-433b-a7e9-59fa22bb650d/registry-server/0.log"
Mar 18 19:34:15 crc kubenswrapper[5008]: I0318 19:34:15.885182 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qz9gd_be98c2c7-e6a9-4459-9693-6c05374daff8/extract-utilities/0.log"
Mar 18 19:34:15 crc kubenswrapper[5008]: I0318 19:34:15.888091 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qz9gd_be98c2c7-e6a9-4459-9693-6c05374daff8/extract-content/0.log"
Mar 18 19:34:16 crc kubenswrapper[5008]: I0318 19:34:16.087377 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-mvtrx_36bc5b4f-3a28-4f99-9c2d-e521639c546a/marketplace-operator/0.log"
Mar 18 19:34:16 crc kubenswrapper[5008]: I0318 19:34:16.249781 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nfbpl_15786344-bb69-48e3-8bc9-e7fea7106bc8/extract-utilities/0.log"
Mar 18 19:34:16 crc kubenswrapper[5008]: I0318 19:34:16.494821 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nfbpl_15786344-bb69-48e3-8bc9-e7fea7106bc8/extract-content/0.log"
Mar 18 19:34:16 crc kubenswrapper[5008]: I0318 19:34:16.516146 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nfbpl_15786344-bb69-48e3-8bc9-e7fea7106bc8/extract-utilities/0.log"
Mar 18 19:34:16 crc kubenswrapper[5008]: I0318 19:34:16.549582 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nfbpl_15786344-bb69-48e3-8bc9-e7fea7106bc8/extract-content/0.log"
Mar 18 19:34:16 crc kubenswrapper[5008]: I0318 19:34:16.577625 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qz9gd_be98c2c7-e6a9-4459-9693-6c05374daff8/registry-server/0.log"
Mar 18 19:34:16 crc kubenswrapper[5008]: I0318 19:34:16.737177 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nfbpl_15786344-bb69-48e3-8bc9-e7fea7106bc8/extract-utilities/0.log"
Mar 18 19:34:16 crc kubenswrapper[5008]: I0318 19:34:16.756678 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nfbpl_15786344-bb69-48e3-8bc9-e7fea7106bc8/extract-content/0.log"
Mar 18 19:34:16 crc kubenswrapper[5008]: I0318 19:34:16.874329 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nfbpl_15786344-bb69-48e3-8bc9-e7fea7106bc8/registry-server/0.log"
Mar 18 19:34:16 crc kubenswrapper[5008]: I0318 19:34:16.884040 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jddwj_b857b251-c73d-4153-8155-4ddd0703759b/extract-utilities/0.log"
Mar 18 19:34:17 crc kubenswrapper[5008]: I0318 19:34:17.133532 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jddwj_b857b251-c73d-4153-8155-4ddd0703759b/extract-utilities/0.log"
Mar 18 19:34:17 crc kubenswrapper[5008]: I0318 19:34:17.153540 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jddwj_b857b251-c73d-4153-8155-4ddd0703759b/extract-content/0.log"
Mar 18 19:34:17 crc kubenswrapper[5008]: I0318 19:34:17.170845 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jddwj_b857b251-c73d-4153-8155-4ddd0703759b/extract-content/0.log"
Mar 18 19:34:17 crc kubenswrapper[5008]: I0318 19:34:17.314690 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jddwj_b857b251-c73d-4153-8155-4ddd0703759b/extract-utilities/0.log"
Mar 18 19:34:17 crc kubenswrapper[5008]: I0318 19:34:17.368595 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jddwj_b857b251-c73d-4153-8155-4ddd0703759b/extract-content/0.log"
Mar 18 19:34:17 crc kubenswrapper[5008]: I0318 19:34:17.942736 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jddwj_b857b251-c73d-4153-8155-4ddd0703759b/registry-server/0.log"
Mar 18 19:34:24 crc kubenswrapper[5008]: I0318 19:34:24.460693 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 19:34:24 crc kubenswrapper[5008]: I0318 19:34:24.461299 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 19:34:24 crc kubenswrapper[5008]: I0318 19:34:24.461360 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt"
Mar 18 19:34:24 crc kubenswrapper[5008]: I0318 19:34:24.462205 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ed354de23509ced81330dce6064f0e54cdb4d741d3f848d673994d0c464ccba"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 19:34:24 crc kubenswrapper[5008]: I0318 19:34:24.462291 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://4ed354de23509ced81330dce6064f0e54cdb4d741d3f848d673994d0c464ccba" gracePeriod=600
Mar 18 19:34:24 crc kubenswrapper[5008]: I0318 19:34:24.829839 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="4ed354de23509ced81330dce6064f0e54cdb4d741d3f848d673994d0c464ccba" exitCode=0
Mar 18 19:34:24 crc kubenswrapper[5008]: I0318 19:34:24.829878 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"4ed354de23509ced81330dce6064f0e54cdb4d741d3f848d673994d0c464ccba"}
Mar 18 19:34:24 crc kubenswrapper[5008]: I0318 19:34:24.830196 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerStarted","Data":"bb68cfa01f563fe658c037b538850cc91db5ca75954107e62dbb9316352fbb53"}
Mar 18 19:34:24 crc kubenswrapper[5008]: I0318 19:34:24.830214 5008 scope.go:117] "RemoveContainer" containerID="1315d1359d95e5665e482b92e1c83fe56d94e3cbc87d6a0b678bb35f4b35dbe7"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.218247 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9qfx9"]
Mar 18 19:34:42 crc kubenswrapper[5008]: E0318 19:34:42.219145 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f82731-0d3b-4ad6-971b-d6dc37970d4c" containerName="oc"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.219161 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f82731-0d3b-4ad6-971b-d6dc37970d4c" containerName="oc"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.219370 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f82731-0d3b-4ad6-971b-d6dc37970d4c" containerName="oc"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.221099 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.230169 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qfx9"]
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.362023 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc64655d-15e0-4338-bad2-081ff6831d17-catalog-content\") pod \"certified-operators-9qfx9\" (UID: \"dc64655d-15e0-4338-bad2-081ff6831d17\") " pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.362115 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb69p\" (UniqueName: \"kubernetes.io/projected/dc64655d-15e0-4338-bad2-081ff6831d17-kube-api-access-tb69p\") pod \"certified-operators-9qfx9\" (UID: \"dc64655d-15e0-4338-bad2-081ff6831d17\") " pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.362151 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc64655d-15e0-4338-bad2-081ff6831d17-utilities\") pod \"certified-operators-9qfx9\" (UID: \"dc64655d-15e0-4338-bad2-081ff6831d17\") " pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.463644 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb69p\" (UniqueName: \"kubernetes.io/projected/dc64655d-15e0-4338-bad2-081ff6831d17-kube-api-access-tb69p\") pod \"certified-operators-9qfx9\" (UID: \"dc64655d-15e0-4338-bad2-081ff6831d17\") " pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.464081 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc64655d-15e0-4338-bad2-081ff6831d17-utilities\") pod \"certified-operators-9qfx9\" (UID: \"dc64655d-15e0-4338-bad2-081ff6831d17\") " pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.464176 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc64655d-15e0-4338-bad2-081ff6831d17-catalog-content\") pod \"certified-operators-9qfx9\" (UID: \"dc64655d-15e0-4338-bad2-081ff6831d17\") " pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.464626 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc64655d-15e0-4338-bad2-081ff6831d17-catalog-content\") pod \"certified-operators-9qfx9\" (UID: \"dc64655d-15e0-4338-bad2-081ff6831d17\") " pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.464781 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc64655d-15e0-4338-bad2-081ff6831d17-utilities\") pod \"certified-operators-9qfx9\" (UID: \"dc64655d-15e0-4338-bad2-081ff6831d17\") " pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.484084 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb69p\" (UniqueName: \"kubernetes.io/projected/dc64655d-15e0-4338-bad2-081ff6831d17-kube-api-access-tb69p\") pod \"certified-operators-9qfx9\" (UID: \"dc64655d-15e0-4338-bad2-081ff6831d17\") " pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.545040 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.878945 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qfx9"]
Mar 18 19:34:42 crc kubenswrapper[5008]: W0318 19:34:42.883658 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc64655d_15e0_4338_bad2_081ff6831d17.slice/crio-39c036db75090cfecc18fc5f22d6fd89f181884f7898b37bc5cd60728593d8e1 WatchSource:0}: Error finding container 39c036db75090cfecc18fc5f22d6fd89f181884f7898b37bc5cd60728593d8e1: Status 404 returned error can't find the container with id 39c036db75090cfecc18fc5f22d6fd89f181884f7898b37bc5cd60728593d8e1
Mar 18 19:34:42 crc kubenswrapper[5008]: I0318 19:34:42.991762 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qfx9" event={"ID":"dc64655d-15e0-4338-bad2-081ff6831d17","Type":"ContainerStarted","Data":"39c036db75090cfecc18fc5f22d6fd89f181884f7898b37bc5cd60728593d8e1"}
Mar 18 19:34:44 crc kubenswrapper[5008]: I0318 19:34:44.029619 5008 generic.go:334] "Generic (PLEG): container finished" podID="dc64655d-15e0-4338-bad2-081ff6831d17" containerID="de384f88ada8fd078038d0876c7b922fff4e71ab5ddb1fa8ace67c69ea46d70d" exitCode=0
Mar 18 19:34:44 crc kubenswrapper[5008]: I0318 19:34:44.029691 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qfx9" event={"ID":"dc64655d-15e0-4338-bad2-081ff6831d17","Type":"ContainerDied","Data":"de384f88ada8fd078038d0876c7b922fff4e71ab5ddb1fa8ace67c69ea46d70d"}
Mar 18 19:34:45 crc kubenswrapper[5008]: I0318 19:34:45.280109 5008 scope.go:117] "RemoveContainer" containerID="941d08f5ef792c7e42bc110c9d8a0641b732abcc5d370590321d68789a5f7d62"
Mar 18 19:34:46 crc kubenswrapper[5008]: I0318 19:34:46.044888 5008 generic.go:334] "Generic (PLEG): container finished" podID="dc64655d-15e0-4338-bad2-081ff6831d17" containerID="288ad948c41cfec4b9d162d091c5ba5917ab54dfb0e0179b6d82af04eee94a9a" exitCode=0
Mar 18 19:34:46 crc kubenswrapper[5008]: I0318 19:34:46.044921 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qfx9" event={"ID":"dc64655d-15e0-4338-bad2-081ff6831d17","Type":"ContainerDied","Data":"288ad948c41cfec4b9d162d091c5ba5917ab54dfb0e0179b6d82af04eee94a9a"}
Mar 18 19:34:47 crc kubenswrapper[5008]: I0318 19:34:47.052634 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qfx9" event={"ID":"dc64655d-15e0-4338-bad2-081ff6831d17","Type":"ContainerStarted","Data":"edeee6dc113326357be2a1c056de956981802b38a695d311a7cb7e0993a06dd1"}
Mar 18 19:34:52 crc kubenswrapper[5008]: I0318 19:34:52.545596 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:52 crc kubenswrapper[5008]: I0318 19:34:52.546174 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:52 crc kubenswrapper[5008]: I0318 19:34:52.615436 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:52 crc kubenswrapper[5008]: I0318 19:34:52.638161 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9qfx9" podStartSLOduration=8.206680626 podStartE2EDuration="10.638134296s" podCreationTimestamp="2026-03-18 19:34:42 +0000 UTC" firstStartedPulling="2026-03-18 19:34:44.037628854 +0000 UTC m=+5540.557101973" lastFinishedPulling="2026-03-18 19:34:46.469082554 +0000 UTC m=+5542.988555643" observedRunningTime="2026-03-18 19:34:47.072654412 +0000 UTC m=+5543.592127501" watchObservedRunningTime="2026-03-18 19:34:52.638134296 +0000 UTC m=+5549.157607385"
Mar 18 19:34:53 crc kubenswrapper[5008]: I0318 19:34:53.130521 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:53 crc kubenswrapper[5008]: I0318 19:34:53.174971 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qfx9"]
Mar 18 19:34:55 crc kubenswrapper[5008]: I0318 19:34:55.116850 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9qfx9" podUID="dc64655d-15e0-4338-bad2-081ff6831d17" containerName="registry-server" containerID="cri-o://edeee6dc113326357be2a1c056de956981802b38a695d311a7cb7e0993a06dd1" gracePeriod=2
Mar 18 19:34:55 crc kubenswrapper[5008]: I0318 19:34:55.634693 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:55 crc kubenswrapper[5008]: I0318 19:34:55.717167 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb69p\" (UniqueName: \"kubernetes.io/projected/dc64655d-15e0-4338-bad2-081ff6831d17-kube-api-access-tb69p\") pod \"dc64655d-15e0-4338-bad2-081ff6831d17\" (UID: \"dc64655d-15e0-4338-bad2-081ff6831d17\") "
Mar 18 19:34:55 crc kubenswrapper[5008]: I0318 19:34:55.717225 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc64655d-15e0-4338-bad2-081ff6831d17-catalog-content\") pod \"dc64655d-15e0-4338-bad2-081ff6831d17\" (UID: \"dc64655d-15e0-4338-bad2-081ff6831d17\") "
Mar 18 19:34:55 crc kubenswrapper[5008]: I0318 19:34:55.717286 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc64655d-15e0-4338-bad2-081ff6831d17-utilities\") pod \"dc64655d-15e0-4338-bad2-081ff6831d17\" (UID: \"dc64655d-15e0-4338-bad2-081ff6831d17\") "
Mar 18 19:34:55 crc kubenswrapper[5008]: I0318 19:34:55.718364 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc64655d-15e0-4338-bad2-081ff6831d17-utilities" (OuterVolumeSpecName: "utilities") pod "dc64655d-15e0-4338-bad2-081ff6831d17" (UID: "dc64655d-15e0-4338-bad2-081ff6831d17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:34:55 crc kubenswrapper[5008]: I0318 19:34:55.724249 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc64655d-15e0-4338-bad2-081ff6831d17-kube-api-access-tb69p" (OuterVolumeSpecName: "kube-api-access-tb69p") pod "dc64655d-15e0-4338-bad2-081ff6831d17" (UID: "dc64655d-15e0-4338-bad2-081ff6831d17"). InnerVolumeSpecName "kube-api-access-tb69p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:34:55 crc kubenswrapper[5008]: I0318 19:34:55.785495 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc64655d-15e0-4338-bad2-081ff6831d17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc64655d-15e0-4338-bad2-081ff6831d17" (UID: "dc64655d-15e0-4338-bad2-081ff6831d17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:34:55 crc kubenswrapper[5008]: I0318 19:34:55.819305 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb69p\" (UniqueName: \"kubernetes.io/projected/dc64655d-15e0-4338-bad2-081ff6831d17-kube-api-access-tb69p\") on node \"crc\" DevicePath \"\""
Mar 18 19:34:55 crc kubenswrapper[5008]: I0318 19:34:55.819333 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc64655d-15e0-4338-bad2-081ff6831d17-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 19:34:55 crc kubenswrapper[5008]: I0318 19:34:55.819342 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc64655d-15e0-4338-bad2-081ff6831d17-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.126649 5008 generic.go:334] "Generic (PLEG): container finished" podID="dc64655d-15e0-4338-bad2-081ff6831d17" containerID="edeee6dc113326357be2a1c056de956981802b38a695d311a7cb7e0993a06dd1" exitCode=0
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.126704 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qfx9" event={"ID":"dc64655d-15e0-4338-bad2-081ff6831d17","Type":"ContainerDied","Data":"edeee6dc113326357be2a1c056de956981802b38a695d311a7cb7e0993a06dd1"}
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.126742 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qfx9"
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.126945 5008 scope.go:117] "RemoveContainer" containerID="edeee6dc113326357be2a1c056de956981802b38a695d311a7cb7e0993a06dd1"
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.126929 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qfx9" event={"ID":"dc64655d-15e0-4338-bad2-081ff6831d17","Type":"ContainerDied","Data":"39c036db75090cfecc18fc5f22d6fd89f181884f7898b37bc5cd60728593d8e1"}
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.147652 5008 scope.go:117] "RemoveContainer" containerID="288ad948c41cfec4b9d162d091c5ba5917ab54dfb0e0179b6d82af04eee94a9a"
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.179695 5008 scope.go:117] "RemoveContainer" containerID="de384f88ada8fd078038d0876c7b922fff4e71ab5ddb1fa8ace67c69ea46d70d"
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.179869 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qfx9"]
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.195338 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9qfx9"]
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.209364 5008 scope.go:117] "RemoveContainer" containerID="edeee6dc113326357be2a1c056de956981802b38a695d311a7cb7e0993a06dd1"
Mar 18 19:34:56 crc kubenswrapper[5008]: E0318 19:34:56.209941 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edeee6dc113326357be2a1c056de956981802b38a695d311a7cb7e0993a06dd1\": container with ID starting with edeee6dc113326357be2a1c056de956981802b38a695d311a7cb7e0993a06dd1 not found: ID does not exist" containerID="edeee6dc113326357be2a1c056de956981802b38a695d311a7cb7e0993a06dd1"
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.209970 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edeee6dc113326357be2a1c056de956981802b38a695d311a7cb7e0993a06dd1"} err="failed to get container status \"edeee6dc113326357be2a1c056de956981802b38a695d311a7cb7e0993a06dd1\": rpc error: code = NotFound desc = could not find container \"edeee6dc113326357be2a1c056de956981802b38a695d311a7cb7e0993a06dd1\": container with ID starting with edeee6dc113326357be2a1c056de956981802b38a695d311a7cb7e0993a06dd1 not found: ID does not exist"
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.209991 5008 scope.go:117] "RemoveContainer" containerID="288ad948c41cfec4b9d162d091c5ba5917ab54dfb0e0179b6d82af04eee94a9a"
Mar 18 19:34:56 crc kubenswrapper[5008]: E0318 19:34:56.210332 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"288ad948c41cfec4b9d162d091c5ba5917ab54dfb0e0179b6d82af04eee94a9a\": container with ID starting with 288ad948c41cfec4b9d162d091c5ba5917ab54dfb0e0179b6d82af04eee94a9a not found: ID does not exist" containerID="288ad948c41cfec4b9d162d091c5ba5917ab54dfb0e0179b6d82af04eee94a9a"
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.210487 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"288ad948c41cfec4b9d162d091c5ba5917ab54dfb0e0179b6d82af04eee94a9a"} err="failed to get container status \"288ad948c41cfec4b9d162d091c5ba5917ab54dfb0e0179b6d82af04eee94a9a\": rpc error: code = NotFound desc = could not find container \"288ad948c41cfec4b9d162d091c5ba5917ab54dfb0e0179b6d82af04eee94a9a\": container with ID starting with 288ad948c41cfec4b9d162d091c5ba5917ab54dfb0e0179b6d82af04eee94a9a not found: ID does not exist"
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.210615 5008 scope.go:117] "RemoveContainer" containerID="de384f88ada8fd078038d0876c7b922fff4e71ab5ddb1fa8ace67c69ea46d70d"
Mar 18 19:34:56 crc kubenswrapper[5008]: E0318 19:34:56.211203 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de384f88ada8fd078038d0876c7b922fff4e71ab5ddb1fa8ace67c69ea46d70d\": container with ID starting with de384f88ada8fd078038d0876c7b922fff4e71ab5ddb1fa8ace67c69ea46d70d not found: ID does not exist" containerID="de384f88ada8fd078038d0876c7b922fff4e71ab5ddb1fa8ace67c69ea46d70d"
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.211233 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de384f88ada8fd078038d0876c7b922fff4e71ab5ddb1fa8ace67c69ea46d70d"} err="failed to get container status \"de384f88ada8fd078038d0876c7b922fff4e71ab5ddb1fa8ace67c69ea46d70d\": rpc error: code = NotFound desc = could not find container \"de384f88ada8fd078038d0876c7b922fff4e71ab5ddb1fa8ace67c69ea46d70d\": container with ID starting with de384f88ada8fd078038d0876c7b922fff4e71ab5ddb1fa8ace67c69ea46d70d not found: ID does not exist"
Mar 18 19:34:56 crc kubenswrapper[5008]: I0318 19:34:56.218908 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc64655d-15e0-4338-bad2-081ff6831d17" path="/var/lib/kubelet/pods/dc64655d-15e0-4338-bad2-081ff6831d17/volumes"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.805424 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-plxq8"]
Mar 18 19:35:34 crc kubenswrapper[5008]: E0318 19:35:34.806296 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc64655d-15e0-4338-bad2-081ff6831d17" containerName="extract-content"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.806310 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc64655d-15e0-4338-bad2-081ff6831d17" containerName="extract-content"
Mar 18 19:35:34 crc kubenswrapper[5008]: E0318 19:35:34.806321 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc64655d-15e0-4338-bad2-081ff6831d17" containerName="extract-utilities"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.806327 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc64655d-15e0-4338-bad2-081ff6831d17" containerName="extract-utilities"
Mar 18 19:35:34 crc kubenswrapper[5008]: E0318 19:35:34.806349 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc64655d-15e0-4338-bad2-081ff6831d17" containerName="registry-server"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.806355 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc64655d-15e0-4338-bad2-081ff6831d17" containerName="registry-server"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.806517 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc64655d-15e0-4338-bad2-081ff6831d17" containerName="registry-server"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.807710 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-plxq8"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.827415 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6140f505-4751-45cc-98bc-c2e55e84ab6a-utilities\") pod \"community-operators-plxq8\" (UID: \"6140f505-4751-45cc-98bc-c2e55e84ab6a\") " pod="openshift-marketplace/community-operators-plxq8"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.827656 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cht46\" (UniqueName: \"kubernetes.io/projected/6140f505-4751-45cc-98bc-c2e55e84ab6a-kube-api-access-cht46\") pod \"community-operators-plxq8\" (UID: \"6140f505-4751-45cc-98bc-c2e55e84ab6a\") " pod="openshift-marketplace/community-operators-plxq8"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.827735 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6140f505-4751-45cc-98bc-c2e55e84ab6a-catalog-content\") pod \"community-operators-plxq8\" (UID: \"6140f505-4751-45cc-98bc-c2e55e84ab6a\") " pod="openshift-marketplace/community-operators-plxq8"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.835385 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-plxq8"]
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.930116 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6140f505-4751-45cc-98bc-c2e55e84ab6a-catalog-content\") pod \"community-operators-plxq8\" (UID: \"6140f505-4751-45cc-98bc-c2e55e84ab6a\") " pod="openshift-marketplace/community-operators-plxq8"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.930247 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6140f505-4751-45cc-98bc-c2e55e84ab6a-utilities\") pod \"community-operators-plxq8\" (UID: \"6140f505-4751-45cc-98bc-c2e55e84ab6a\") " pod="openshift-marketplace/community-operators-plxq8"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.930443 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cht46\" (UniqueName: \"kubernetes.io/projected/6140f505-4751-45cc-98bc-c2e55e84ab6a-kube-api-access-cht46\") pod \"community-operators-plxq8\" (UID: \"6140f505-4751-45cc-98bc-c2e55e84ab6a\") " pod="openshift-marketplace/community-operators-plxq8"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.930799 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6140f505-4751-45cc-98bc-c2e55e84ab6a-catalog-content\") pod \"community-operators-plxq8\" (UID: \"6140f505-4751-45cc-98bc-c2e55e84ab6a\") " pod="openshift-marketplace/community-operators-plxq8"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.931161 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6140f505-4751-45cc-98bc-c2e55e84ab6a-utilities\") pod \"community-operators-plxq8\" (UID: \"6140f505-4751-45cc-98bc-c2e55e84ab6a\") " pod="openshift-marketplace/community-operators-plxq8"
Mar 18 19:35:34 crc kubenswrapper[5008]: I0318 19:35:34.955237 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cht46\" (UniqueName: \"kubernetes.io/projected/6140f505-4751-45cc-98bc-c2e55e84ab6a-kube-api-access-cht46\") pod \"community-operators-plxq8\" (UID: \"6140f505-4751-45cc-98bc-c2e55e84ab6a\") " pod="openshift-marketplace/community-operators-plxq8"
Mar 18 19:35:35 crc kubenswrapper[5008]: I0318 19:35:35.134876 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-plxq8"
Mar 18 19:35:35 crc kubenswrapper[5008]: I0318 19:35:35.622697 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-plxq8"]
Mar 18 19:35:36 crc kubenswrapper[5008]: I0318 19:35:36.525309 5008 generic.go:334] "Generic (PLEG): container finished" podID="6140f505-4751-45cc-98bc-c2e55e84ab6a" containerID="632a6bd32b2762d58b6716342b5ebd2bd5395ea9468de33e26df2820e8053862" exitCode=0
Mar 18 19:35:36 crc kubenswrapper[5008]: I0318 19:35:36.525469 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plxq8" event={"ID":"6140f505-4751-45cc-98bc-c2e55e84ab6a","Type":"ContainerDied","Data":"632a6bd32b2762d58b6716342b5ebd2bd5395ea9468de33e26df2820e8053862"}
Mar 18 19:35:36 crc kubenswrapper[5008]: I0318 19:35:36.525621 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plxq8"
event={"ID":"6140f505-4751-45cc-98bc-c2e55e84ab6a","Type":"ContainerStarted","Data":"b9a42e7632d09ad8d3d420997d7bce605af38daf97cf02234cf9dc5f69a467f3"} Mar 18 19:35:36 crc kubenswrapper[5008]: I0318 19:35:36.528046 5008 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 19:35:37 crc kubenswrapper[5008]: I0318 19:35:37.534753 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plxq8" event={"ID":"6140f505-4751-45cc-98bc-c2e55e84ab6a","Type":"ContainerStarted","Data":"aff880aad7d7bf1318170226eaad6c19f3718023347f0a50730024140f80b91b"} Mar 18 19:35:38 crc kubenswrapper[5008]: I0318 19:35:38.545585 5008 generic.go:334] "Generic (PLEG): container finished" podID="6140f505-4751-45cc-98bc-c2e55e84ab6a" containerID="aff880aad7d7bf1318170226eaad6c19f3718023347f0a50730024140f80b91b" exitCode=0 Mar 18 19:35:38 crc kubenswrapper[5008]: I0318 19:35:38.545631 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plxq8" event={"ID":"6140f505-4751-45cc-98bc-c2e55e84ab6a","Type":"ContainerDied","Data":"aff880aad7d7bf1318170226eaad6c19f3718023347f0a50730024140f80b91b"} Mar 18 19:35:39 crc kubenswrapper[5008]: I0318 19:35:39.556150 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plxq8" event={"ID":"6140f505-4751-45cc-98bc-c2e55e84ab6a","Type":"ContainerStarted","Data":"62dc558c703c48b124b7e27ea99e94191e4cfd234ce6d0c6bffdad86505c0475"} Mar 18 19:35:40 crc kubenswrapper[5008]: I0318 19:35:40.568692 5008 generic.go:334] "Generic (PLEG): container finished" podID="8c131199-f3fc-473d-be6f-7f3f42b8c542" containerID="357cd3737a0fc9c7dbe2d52cefb39b0b126e017157debbb33ce559623517edac" exitCode=0 Mar 18 19:35:40 crc kubenswrapper[5008]: I0318 19:35:40.568793 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnkpt/must-gather-5trpz" 
event={"ID":"8c131199-f3fc-473d-be6f-7f3f42b8c542","Type":"ContainerDied","Data":"357cd3737a0fc9c7dbe2d52cefb39b0b126e017157debbb33ce559623517edac"} Mar 18 19:35:40 crc kubenswrapper[5008]: I0318 19:35:40.569412 5008 scope.go:117] "RemoveContainer" containerID="357cd3737a0fc9c7dbe2d52cefb39b0b126e017157debbb33ce559623517edac" Mar 18 19:35:40 crc kubenswrapper[5008]: I0318 19:35:40.596109 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-plxq8" podStartSLOduration=4.046193077 podStartE2EDuration="6.596078763s" podCreationTimestamp="2026-03-18 19:35:34 +0000 UTC" firstStartedPulling="2026-03-18 19:35:36.527739038 +0000 UTC m=+5593.047212117" lastFinishedPulling="2026-03-18 19:35:39.077624684 +0000 UTC m=+5595.597097803" observedRunningTime="2026-03-18 19:35:39.587654229 +0000 UTC m=+5596.107127318" watchObservedRunningTime="2026-03-18 19:35:40.596078763 +0000 UTC m=+5597.115551882" Mar 18 19:35:40 crc kubenswrapper[5008]: I0318 19:35:40.745699 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vnkpt_must-gather-5trpz_8c131199-f3fc-473d-be6f-7f3f42b8c542/gather/0.log" Mar 18 19:35:45 crc kubenswrapper[5008]: I0318 19:35:45.135827 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-plxq8" Mar 18 19:35:45 crc kubenswrapper[5008]: I0318 19:35:45.136222 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-plxq8" Mar 18 19:35:45 crc kubenswrapper[5008]: I0318 19:35:45.216445 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-plxq8" Mar 18 19:35:45 crc kubenswrapper[5008]: I0318 19:35:45.361534 5008 scope.go:117] "RemoveContainer" containerID="d24911c04aa8db58d1e518078a31609f846fe9efc7614bf06a2f424d4c5b858e" Mar 18 19:35:45 crc kubenswrapper[5008]: I0318 19:35:45.404063 
5008 scope.go:117] "RemoveContainer" containerID="7580b00b3d1992c65ce9d70fcf8ee5f49fc6d055bf398aa4aeb6d84c48e8ffce" Mar 18 19:35:45 crc kubenswrapper[5008]: I0318 19:35:45.696958 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-plxq8" Mar 18 19:35:45 crc kubenswrapper[5008]: I0318 19:35:45.770260 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-plxq8"] Mar 18 19:35:47 crc kubenswrapper[5008]: I0318 19:35:47.642903 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-plxq8" podUID="6140f505-4751-45cc-98bc-c2e55e84ab6a" containerName="registry-server" containerID="cri-o://62dc558c703c48b124b7e27ea99e94191e4cfd234ce6d0c6bffdad86505c0475" gracePeriod=2 Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.108334 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-plxq8" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.269513 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6140f505-4751-45cc-98bc-c2e55e84ab6a-catalog-content\") pod \"6140f505-4751-45cc-98bc-c2e55e84ab6a\" (UID: \"6140f505-4751-45cc-98bc-c2e55e84ab6a\") " Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.269669 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cht46\" (UniqueName: \"kubernetes.io/projected/6140f505-4751-45cc-98bc-c2e55e84ab6a-kube-api-access-cht46\") pod \"6140f505-4751-45cc-98bc-c2e55e84ab6a\" (UID: \"6140f505-4751-45cc-98bc-c2e55e84ab6a\") " Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.269695 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6140f505-4751-45cc-98bc-c2e55e84ab6a-utilities\") pod \"6140f505-4751-45cc-98bc-c2e55e84ab6a\" (UID: \"6140f505-4751-45cc-98bc-c2e55e84ab6a\") " Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.270962 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6140f505-4751-45cc-98bc-c2e55e84ab6a-utilities" (OuterVolumeSpecName: "utilities") pod "6140f505-4751-45cc-98bc-c2e55e84ab6a" (UID: "6140f505-4751-45cc-98bc-c2e55e84ab6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.290792 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6140f505-4751-45cc-98bc-c2e55e84ab6a-kube-api-access-cht46" (OuterVolumeSpecName: "kube-api-access-cht46") pod "6140f505-4751-45cc-98bc-c2e55e84ab6a" (UID: "6140f505-4751-45cc-98bc-c2e55e84ab6a"). InnerVolumeSpecName "kube-api-access-cht46". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.329648 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6140f505-4751-45cc-98bc-c2e55e84ab6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6140f505-4751-45cc-98bc-c2e55e84ab6a" (UID: "6140f505-4751-45cc-98bc-c2e55e84ab6a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.371897 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6140f505-4751-45cc-98bc-c2e55e84ab6a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.372119 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cht46\" (UniqueName: \"kubernetes.io/projected/6140f505-4751-45cc-98bc-c2e55e84ab6a-kube-api-access-cht46\") on node \"crc\" DevicePath \"\"" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.372193 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6140f505-4751-45cc-98bc-c2e55e84ab6a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.381117 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vnkpt/must-gather-5trpz"] Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.383238 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vnkpt/must-gather-5trpz" podUID="8c131199-f3fc-473d-be6f-7f3f42b8c542" containerName="copy" containerID="cri-o://708f496f9ab39dcd85911634502f4126814721f0e02eea2583fabea7b70db17d" gracePeriod=2 Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.391391 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vnkpt/must-gather-5trpz"] Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.659988 5008 generic.go:334] "Generic (PLEG): container finished" podID="6140f505-4751-45cc-98bc-c2e55e84ab6a" containerID="62dc558c703c48b124b7e27ea99e94191e4cfd234ce6d0c6bffdad86505c0475" exitCode=0 Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.660089 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plxq8" 
event={"ID":"6140f505-4751-45cc-98bc-c2e55e84ab6a","Type":"ContainerDied","Data":"62dc558c703c48b124b7e27ea99e94191e4cfd234ce6d0c6bffdad86505c0475"} Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.660124 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plxq8" event={"ID":"6140f505-4751-45cc-98bc-c2e55e84ab6a","Type":"ContainerDied","Data":"b9a42e7632d09ad8d3d420997d7bce605af38daf97cf02234cf9dc5f69a467f3"} Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.660145 5008 scope.go:117] "RemoveContainer" containerID="62dc558c703c48b124b7e27ea99e94191e4cfd234ce6d0c6bffdad86505c0475" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.660341 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-plxq8" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.683999 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vnkpt_must-gather-5trpz_8c131199-f3fc-473d-be6f-7f3f42b8c542/copy/0.log" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.685057 5008 generic.go:334] "Generic (PLEG): container finished" podID="8c131199-f3fc-473d-be6f-7f3f42b8c542" containerID="708f496f9ab39dcd85911634502f4126814721f0e02eea2583fabea7b70db17d" exitCode=143 Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.709601 5008 scope.go:117] "RemoveContainer" containerID="aff880aad7d7bf1318170226eaad6c19f3718023347f0a50730024140f80b91b" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.729358 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-plxq8"] Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.737063 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-plxq8"] Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.740205 5008 scope.go:117] "RemoveContainer" 
containerID="632a6bd32b2762d58b6716342b5ebd2bd5395ea9468de33e26df2820e8053862" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.745496 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vnkpt_must-gather-5trpz_8c131199-f3fc-473d-be6f-7f3f42b8c542/copy/0.log" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.746292 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vnkpt/must-gather-5trpz" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.757385 5008 scope.go:117] "RemoveContainer" containerID="62dc558c703c48b124b7e27ea99e94191e4cfd234ce6d0c6bffdad86505c0475" Mar 18 19:35:48 crc kubenswrapper[5008]: E0318 19:35:48.757797 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62dc558c703c48b124b7e27ea99e94191e4cfd234ce6d0c6bffdad86505c0475\": container with ID starting with 62dc558c703c48b124b7e27ea99e94191e4cfd234ce6d0c6bffdad86505c0475 not found: ID does not exist" containerID="62dc558c703c48b124b7e27ea99e94191e4cfd234ce6d0c6bffdad86505c0475" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.757833 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62dc558c703c48b124b7e27ea99e94191e4cfd234ce6d0c6bffdad86505c0475"} err="failed to get container status \"62dc558c703c48b124b7e27ea99e94191e4cfd234ce6d0c6bffdad86505c0475\": rpc error: code = NotFound desc = could not find container \"62dc558c703c48b124b7e27ea99e94191e4cfd234ce6d0c6bffdad86505c0475\": container with ID starting with 62dc558c703c48b124b7e27ea99e94191e4cfd234ce6d0c6bffdad86505c0475 not found: ID does not exist" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.757859 5008 scope.go:117] "RemoveContainer" containerID="aff880aad7d7bf1318170226eaad6c19f3718023347f0a50730024140f80b91b" Mar 18 19:35:48 crc kubenswrapper[5008]: E0318 19:35:48.758153 5008 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff880aad7d7bf1318170226eaad6c19f3718023347f0a50730024140f80b91b\": container with ID starting with aff880aad7d7bf1318170226eaad6c19f3718023347f0a50730024140f80b91b not found: ID does not exist" containerID="aff880aad7d7bf1318170226eaad6c19f3718023347f0a50730024140f80b91b" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.758176 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff880aad7d7bf1318170226eaad6c19f3718023347f0a50730024140f80b91b"} err="failed to get container status \"aff880aad7d7bf1318170226eaad6c19f3718023347f0a50730024140f80b91b\": rpc error: code = NotFound desc = could not find container \"aff880aad7d7bf1318170226eaad6c19f3718023347f0a50730024140f80b91b\": container with ID starting with aff880aad7d7bf1318170226eaad6c19f3718023347f0a50730024140f80b91b not found: ID does not exist" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.758209 5008 scope.go:117] "RemoveContainer" containerID="632a6bd32b2762d58b6716342b5ebd2bd5395ea9468de33e26df2820e8053862" Mar 18 19:35:48 crc kubenswrapper[5008]: E0318 19:35:48.760195 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632a6bd32b2762d58b6716342b5ebd2bd5395ea9468de33e26df2820e8053862\": container with ID starting with 632a6bd32b2762d58b6716342b5ebd2bd5395ea9468de33e26df2820e8053862 not found: ID does not exist" containerID="632a6bd32b2762d58b6716342b5ebd2bd5395ea9468de33e26df2820e8053862" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.760240 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632a6bd32b2762d58b6716342b5ebd2bd5395ea9468de33e26df2820e8053862"} err="failed to get container status \"632a6bd32b2762d58b6716342b5ebd2bd5395ea9468de33e26df2820e8053862\": rpc error: code = NotFound desc = could 
not find container \"632a6bd32b2762d58b6716342b5ebd2bd5395ea9468de33e26df2820e8053862\": container with ID starting with 632a6bd32b2762d58b6716342b5ebd2bd5395ea9468de33e26df2820e8053862 not found: ID does not exist" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.881757 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8c131199-f3fc-473d-be6f-7f3f42b8c542-must-gather-output\") pod \"8c131199-f3fc-473d-be6f-7f3f42b8c542\" (UID: \"8c131199-f3fc-473d-be6f-7f3f42b8c542\") " Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.883731 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmdgk\" (UniqueName: \"kubernetes.io/projected/8c131199-f3fc-473d-be6f-7f3f42b8c542-kube-api-access-nmdgk\") pod \"8c131199-f3fc-473d-be6f-7f3f42b8c542\" (UID: \"8c131199-f3fc-473d-be6f-7f3f42b8c542\") " Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.889657 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c131199-f3fc-473d-be6f-7f3f42b8c542-kube-api-access-nmdgk" (OuterVolumeSpecName: "kube-api-access-nmdgk") pod "8c131199-f3fc-473d-be6f-7f3f42b8c542" (UID: "8c131199-f3fc-473d-be6f-7f3f42b8c542"). InnerVolumeSpecName "kube-api-access-nmdgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.986378 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmdgk\" (UniqueName: \"kubernetes.io/projected/8c131199-f3fc-473d-be6f-7f3f42b8c542-kube-api-access-nmdgk\") on node \"crc\" DevicePath \"\"" Mar 18 19:35:48 crc kubenswrapper[5008]: I0318 19:35:48.987534 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c131199-f3fc-473d-be6f-7f3f42b8c542-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8c131199-f3fc-473d-be6f-7f3f42b8c542" (UID: "8c131199-f3fc-473d-be6f-7f3f42b8c542"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:35:49 crc kubenswrapper[5008]: I0318 19:35:49.087772 5008 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8c131199-f3fc-473d-be6f-7f3f42b8c542-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 19:35:49 crc kubenswrapper[5008]: I0318 19:35:49.695823 5008 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vnkpt_must-gather-5trpz_8c131199-f3fc-473d-be6f-7f3f42b8c542/copy/0.log" Mar 18 19:35:49 crc kubenswrapper[5008]: I0318 19:35:49.696290 5008 scope.go:117] "RemoveContainer" containerID="708f496f9ab39dcd85911634502f4126814721f0e02eea2583fabea7b70db17d" Mar 18 19:35:49 crc kubenswrapper[5008]: I0318 19:35:49.696350 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnkpt/must-gather-5trpz" Mar 18 19:35:49 crc kubenswrapper[5008]: I0318 19:35:49.734979 5008 scope.go:117] "RemoveContainer" containerID="357cd3737a0fc9c7dbe2d52cefb39b0b126e017157debbb33ce559623517edac" Mar 18 19:35:50 crc kubenswrapper[5008]: I0318 19:35:50.214509 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6140f505-4751-45cc-98bc-c2e55e84ab6a" path="/var/lib/kubelet/pods/6140f505-4751-45cc-98bc-c2e55e84ab6a/volumes" Mar 18 19:35:50 crc kubenswrapper[5008]: I0318 19:35:50.216432 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c131199-f3fc-473d-be6f-7f3f42b8c542" path="/var/lib/kubelet/pods/8c131199-f3fc-473d-be6f-7f3f42b8c542/volumes" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.159149 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564376-l2gvr"] Mar 18 19:36:00 crc kubenswrapper[5008]: E0318 19:36:00.160478 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c131199-f3fc-473d-be6f-7f3f42b8c542" containerName="copy" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.160496 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c131199-f3fc-473d-be6f-7f3f42b8c542" containerName="copy" Mar 18 19:36:00 crc kubenswrapper[5008]: E0318 19:36:00.160537 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6140f505-4751-45cc-98bc-c2e55e84ab6a" containerName="extract-content" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.160546 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6140f505-4751-45cc-98bc-c2e55e84ab6a" containerName="extract-content" Mar 18 19:36:00 crc kubenswrapper[5008]: E0318 19:36:00.160639 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c131199-f3fc-473d-be6f-7f3f42b8c542" containerName="gather" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.160687 5008 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8c131199-f3fc-473d-be6f-7f3f42b8c542" containerName="gather" Mar 18 19:36:00 crc kubenswrapper[5008]: E0318 19:36:00.160722 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6140f505-4751-45cc-98bc-c2e55e84ab6a" containerName="registry-server" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.160761 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6140f505-4751-45cc-98bc-c2e55e84ab6a" containerName="registry-server" Mar 18 19:36:00 crc kubenswrapper[5008]: E0318 19:36:00.160784 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6140f505-4751-45cc-98bc-c2e55e84ab6a" containerName="extract-utilities" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.160796 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="6140f505-4751-45cc-98bc-c2e55e84ab6a" containerName="extract-utilities" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.161130 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="6140f505-4751-45cc-98bc-c2e55e84ab6a" containerName="registry-server" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.161162 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c131199-f3fc-473d-be6f-7f3f42b8c542" containerName="gather" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.161181 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c131199-f3fc-473d-be6f-7f3f42b8c542" containerName="copy" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.161899 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564376-l2gvr" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.164140 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.169871 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.170228 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.184043 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564376-l2gvr"] Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.302005 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhzwq\" (UniqueName: \"kubernetes.io/projected/f51c972a-d026-4b82-aeaa-97e49411522a-kube-api-access-lhzwq\") pod \"auto-csr-approver-29564376-l2gvr\" (UID: \"f51c972a-d026-4b82-aeaa-97e49411522a\") " pod="openshift-infra/auto-csr-approver-29564376-l2gvr" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.404462 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhzwq\" (UniqueName: \"kubernetes.io/projected/f51c972a-d026-4b82-aeaa-97e49411522a-kube-api-access-lhzwq\") pod \"auto-csr-approver-29564376-l2gvr\" (UID: \"f51c972a-d026-4b82-aeaa-97e49411522a\") " pod="openshift-infra/auto-csr-approver-29564376-l2gvr" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.424016 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhzwq\" (UniqueName: \"kubernetes.io/projected/f51c972a-d026-4b82-aeaa-97e49411522a-kube-api-access-lhzwq\") pod \"auto-csr-approver-29564376-l2gvr\" (UID: \"f51c972a-d026-4b82-aeaa-97e49411522a\") " 
pod="openshift-infra/auto-csr-approver-29564376-l2gvr" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.488697 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564376-l2gvr" Mar 18 19:36:00 crc kubenswrapper[5008]: I0318 19:36:00.802125 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564376-l2gvr"] Mar 18 19:36:01 crc kubenswrapper[5008]: I0318 19:36:01.799667 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564376-l2gvr" event={"ID":"f51c972a-d026-4b82-aeaa-97e49411522a","Type":"ContainerStarted","Data":"5be5cc30f8c13fdaf2d9409349973bed95edd4e627b4d59d7f0d2de6fb83c169"} Mar 18 19:36:02 crc kubenswrapper[5008]: I0318 19:36:02.815511 5008 generic.go:334] "Generic (PLEG): container finished" podID="f51c972a-d026-4b82-aeaa-97e49411522a" containerID="0961185df6c63c845182957220ebb338a89d13793a0ffe0c48b43c1883fbae6e" exitCode=0 Mar 18 19:36:02 crc kubenswrapper[5008]: I0318 19:36:02.815634 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564376-l2gvr" event={"ID":"f51c972a-d026-4b82-aeaa-97e49411522a","Type":"ContainerDied","Data":"0961185df6c63c845182957220ebb338a89d13793a0ffe0c48b43c1883fbae6e"} Mar 18 19:36:04 crc kubenswrapper[5008]: I0318 19:36:04.130030 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564376-l2gvr" Mar 18 19:36:04 crc kubenswrapper[5008]: I0318 19:36:04.204618 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhzwq\" (UniqueName: \"kubernetes.io/projected/f51c972a-d026-4b82-aeaa-97e49411522a-kube-api-access-lhzwq\") pod \"f51c972a-d026-4b82-aeaa-97e49411522a\" (UID: \"f51c972a-d026-4b82-aeaa-97e49411522a\") " Mar 18 19:36:04 crc kubenswrapper[5008]: I0318 19:36:04.209445 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f51c972a-d026-4b82-aeaa-97e49411522a-kube-api-access-lhzwq" (OuterVolumeSpecName: "kube-api-access-lhzwq") pod "f51c972a-d026-4b82-aeaa-97e49411522a" (UID: "f51c972a-d026-4b82-aeaa-97e49411522a"). InnerVolumeSpecName "kube-api-access-lhzwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:36:04 crc kubenswrapper[5008]: I0318 19:36:04.307345 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhzwq\" (UniqueName: \"kubernetes.io/projected/f51c972a-d026-4b82-aeaa-97e49411522a-kube-api-access-lhzwq\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:04 crc kubenswrapper[5008]: I0318 19:36:04.831420 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564376-l2gvr" event={"ID":"f51c972a-d026-4b82-aeaa-97e49411522a","Type":"ContainerDied","Data":"5be5cc30f8c13fdaf2d9409349973bed95edd4e627b4d59d7f0d2de6fb83c169"} Mar 18 19:36:04 crc kubenswrapper[5008]: I0318 19:36:04.831484 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5be5cc30f8c13fdaf2d9409349973bed95edd4e627b4d59d7f0d2de6fb83c169" Mar 18 19:36:04 crc kubenswrapper[5008]: I0318 19:36:04.831483 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564376-l2gvr" Mar 18 19:36:05 crc kubenswrapper[5008]: I0318 19:36:05.214972 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564370-d58j9"] Mar 18 19:36:05 crc kubenswrapper[5008]: I0318 19:36:05.222314 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564370-d58j9"] Mar 18 19:36:06 crc kubenswrapper[5008]: I0318 19:36:06.212990 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be5fd66-baab-4624-a51e-17d49300c05a" path="/var/lib/kubelet/pods/1be5fd66-baab-4624-a51e-17d49300c05a/volumes" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.339250 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l5hff"] Mar 18 19:36:09 crc kubenswrapper[5008]: E0318 19:36:09.340404 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f51c972a-d026-4b82-aeaa-97e49411522a" containerName="oc" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.340424 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="f51c972a-d026-4b82-aeaa-97e49411522a" containerName="oc" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.340653 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="f51c972a-d026-4b82-aeaa-97e49411522a" containerName="oc" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.341952 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.346111 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5hff"] Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.511991 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb395746-ce79-4109-9116-1872b9e98e1a-catalog-content\") pod \"redhat-marketplace-l5hff\" (UID: \"fb395746-ce79-4109-9116-1872b9e98e1a\") " pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.512091 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb395746-ce79-4109-9116-1872b9e98e1a-utilities\") pod \"redhat-marketplace-l5hff\" (UID: \"fb395746-ce79-4109-9116-1872b9e98e1a\") " pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.512206 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b98l\" (UniqueName: \"kubernetes.io/projected/fb395746-ce79-4109-9116-1872b9e98e1a-kube-api-access-9b98l\") pod \"redhat-marketplace-l5hff\" (UID: \"fb395746-ce79-4109-9116-1872b9e98e1a\") " pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.614292 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb395746-ce79-4109-9116-1872b9e98e1a-catalog-content\") pod \"redhat-marketplace-l5hff\" (UID: \"fb395746-ce79-4109-9116-1872b9e98e1a\") " pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.614398 5008 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb395746-ce79-4109-9116-1872b9e98e1a-utilities\") pod \"redhat-marketplace-l5hff\" (UID: \"fb395746-ce79-4109-9116-1872b9e98e1a\") " pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.614431 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b98l\" (UniqueName: \"kubernetes.io/projected/fb395746-ce79-4109-9116-1872b9e98e1a-kube-api-access-9b98l\") pod \"redhat-marketplace-l5hff\" (UID: \"fb395746-ce79-4109-9116-1872b9e98e1a\") " pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.615257 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb395746-ce79-4109-9116-1872b9e98e1a-catalog-content\") pod \"redhat-marketplace-l5hff\" (UID: \"fb395746-ce79-4109-9116-1872b9e98e1a\") " pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.615345 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb395746-ce79-4109-9116-1872b9e98e1a-utilities\") pod \"redhat-marketplace-l5hff\" (UID: \"fb395746-ce79-4109-9116-1872b9e98e1a\") " pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.637780 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b98l\" (UniqueName: \"kubernetes.io/projected/fb395746-ce79-4109-9116-1872b9e98e1a-kube-api-access-9b98l\") pod \"redhat-marketplace-l5hff\" (UID: \"fb395746-ce79-4109-9116-1872b9e98e1a\") " pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.714478 5008 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:09 crc kubenswrapper[5008]: I0318 19:36:09.972809 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5hff"] Mar 18 19:36:10 crc kubenswrapper[5008]: I0318 19:36:10.882548 5008 generic.go:334] "Generic (PLEG): container finished" podID="fb395746-ce79-4109-9116-1872b9e98e1a" containerID="c50bbfd3988d843eb82bca29d73f721acb3272c193d22f68ace2faea7039d522" exitCode=0 Mar 18 19:36:10 crc kubenswrapper[5008]: I0318 19:36:10.882807 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5hff" event={"ID":"fb395746-ce79-4109-9116-1872b9e98e1a","Type":"ContainerDied","Data":"c50bbfd3988d843eb82bca29d73f721acb3272c193d22f68ace2faea7039d522"} Mar 18 19:36:10 crc kubenswrapper[5008]: I0318 19:36:10.883023 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5hff" event={"ID":"fb395746-ce79-4109-9116-1872b9e98e1a","Type":"ContainerStarted","Data":"7ea6457bae8faa12c1e808e30ef62820b7f97991deef7fa1644e3bcbc07028f6"} Mar 18 19:36:11 crc kubenswrapper[5008]: I0318 19:36:11.898281 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5hff" event={"ID":"fb395746-ce79-4109-9116-1872b9e98e1a","Type":"ContainerStarted","Data":"0b6d6e4aecb0d75be19bcfc0482eeee5076abe136bdfe89abd4abbb83eaf81fe"} Mar 18 19:36:12 crc kubenswrapper[5008]: I0318 19:36:12.913812 5008 generic.go:334] "Generic (PLEG): container finished" podID="fb395746-ce79-4109-9116-1872b9e98e1a" containerID="0b6d6e4aecb0d75be19bcfc0482eeee5076abe136bdfe89abd4abbb83eaf81fe" exitCode=0 Mar 18 19:36:12 crc kubenswrapper[5008]: I0318 19:36:12.913928 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5hff" 
event={"ID":"fb395746-ce79-4109-9116-1872b9e98e1a","Type":"ContainerDied","Data":"0b6d6e4aecb0d75be19bcfc0482eeee5076abe136bdfe89abd4abbb83eaf81fe"} Mar 18 19:36:13 crc kubenswrapper[5008]: I0318 19:36:13.934909 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5hff" event={"ID":"fb395746-ce79-4109-9116-1872b9e98e1a","Type":"ContainerStarted","Data":"14c0228a597265a98b1fd9da7dcc1b85b50bbd1500a909a8ac9765d112d3e268"} Mar 18 19:36:13 crc kubenswrapper[5008]: I0318 19:36:13.965681 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l5hff" podStartSLOduration=2.541714705 podStartE2EDuration="4.965653218s" podCreationTimestamp="2026-03-18 19:36:09 +0000 UTC" firstStartedPulling="2026-03-18 19:36:10.885858026 +0000 UTC m=+5627.405331145" lastFinishedPulling="2026-03-18 19:36:13.309796579 +0000 UTC m=+5629.829269658" observedRunningTime="2026-03-18 19:36:13.964003274 +0000 UTC m=+5630.483476393" watchObservedRunningTime="2026-03-18 19:36:13.965653218 +0000 UTC m=+5630.485126337" Mar 18 19:36:19 crc kubenswrapper[5008]: I0318 19:36:19.715735 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:19 crc kubenswrapper[5008]: I0318 19:36:19.716240 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:19 crc kubenswrapper[5008]: I0318 19:36:19.769752 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:20 crc kubenswrapper[5008]: I0318 19:36:20.039905 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:20 crc kubenswrapper[5008]: I0318 19:36:20.090874 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-l5hff"] Mar 18 19:36:22 crc kubenswrapper[5008]: I0318 19:36:22.026775 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l5hff" podUID="fb395746-ce79-4109-9116-1872b9e98e1a" containerName="registry-server" containerID="cri-o://14c0228a597265a98b1fd9da7dcc1b85b50bbd1500a909a8ac9765d112d3e268" gracePeriod=2 Mar 18 19:36:22 crc kubenswrapper[5008]: I0318 19:36:22.464512 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:22 crc kubenswrapper[5008]: I0318 19:36:22.558675 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb395746-ce79-4109-9116-1872b9e98e1a-utilities\") pod \"fb395746-ce79-4109-9116-1872b9e98e1a\" (UID: \"fb395746-ce79-4109-9116-1872b9e98e1a\") " Mar 18 19:36:22 crc kubenswrapper[5008]: I0318 19:36:22.558858 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b98l\" (UniqueName: \"kubernetes.io/projected/fb395746-ce79-4109-9116-1872b9e98e1a-kube-api-access-9b98l\") pod \"fb395746-ce79-4109-9116-1872b9e98e1a\" (UID: \"fb395746-ce79-4109-9116-1872b9e98e1a\") " Mar 18 19:36:22 crc kubenswrapper[5008]: I0318 19:36:22.558911 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb395746-ce79-4109-9116-1872b9e98e1a-catalog-content\") pod \"fb395746-ce79-4109-9116-1872b9e98e1a\" (UID: \"fb395746-ce79-4109-9116-1872b9e98e1a\") " Mar 18 19:36:22 crc kubenswrapper[5008]: I0318 19:36:22.559488 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb395746-ce79-4109-9116-1872b9e98e1a-utilities" (OuterVolumeSpecName: "utilities") pod "fb395746-ce79-4109-9116-1872b9e98e1a" (UID: 
"fb395746-ce79-4109-9116-1872b9e98e1a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:36:22 crc kubenswrapper[5008]: I0318 19:36:22.566060 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb395746-ce79-4109-9116-1872b9e98e1a-kube-api-access-9b98l" (OuterVolumeSpecName: "kube-api-access-9b98l") pod "fb395746-ce79-4109-9116-1872b9e98e1a" (UID: "fb395746-ce79-4109-9116-1872b9e98e1a"). InnerVolumeSpecName "kube-api-access-9b98l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 19:36:22 crc kubenswrapper[5008]: I0318 19:36:22.586546 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb395746-ce79-4109-9116-1872b9e98e1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb395746-ce79-4109-9116-1872b9e98e1a" (UID: "fb395746-ce79-4109-9116-1872b9e98e1a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 19:36:22 crc kubenswrapper[5008]: I0318 19:36:22.661059 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb395746-ce79-4109-9116-1872b9e98e1a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:22 crc kubenswrapper[5008]: I0318 19:36:22.661091 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b98l\" (UniqueName: \"kubernetes.io/projected/fb395746-ce79-4109-9116-1872b9e98e1a-kube-api-access-9b98l\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:22 crc kubenswrapper[5008]: I0318 19:36:22.661101 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb395746-ce79-4109-9116-1872b9e98e1a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.040172 5008 generic.go:334] "Generic (PLEG): container finished" 
podID="fb395746-ce79-4109-9116-1872b9e98e1a" containerID="14c0228a597265a98b1fd9da7dcc1b85b50bbd1500a909a8ac9765d112d3e268" exitCode=0 Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.040214 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5hff" event={"ID":"fb395746-ce79-4109-9116-1872b9e98e1a","Type":"ContainerDied","Data":"14c0228a597265a98b1fd9da7dcc1b85b50bbd1500a909a8ac9765d112d3e268"} Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.040279 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5hff" event={"ID":"fb395746-ce79-4109-9116-1872b9e98e1a","Type":"ContainerDied","Data":"7ea6457bae8faa12c1e808e30ef62820b7f97991deef7fa1644e3bcbc07028f6"} Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.040310 5008 scope.go:117] "RemoveContainer" containerID="14c0228a597265a98b1fd9da7dcc1b85b50bbd1500a909a8ac9765d112d3e268" Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.040308 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5hff" Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.080966 5008 scope.go:117] "RemoveContainer" containerID="0b6d6e4aecb0d75be19bcfc0482eeee5076abe136bdfe89abd4abbb83eaf81fe" Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.093763 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5hff"] Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.100787 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5hff"] Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.118633 5008 scope.go:117] "RemoveContainer" containerID="c50bbfd3988d843eb82bca29d73f721acb3272c193d22f68ace2faea7039d522" Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.169476 5008 scope.go:117] "RemoveContainer" containerID="14c0228a597265a98b1fd9da7dcc1b85b50bbd1500a909a8ac9765d112d3e268" Mar 18 19:36:23 crc kubenswrapper[5008]: E0318 19:36:23.169970 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c0228a597265a98b1fd9da7dcc1b85b50bbd1500a909a8ac9765d112d3e268\": container with ID starting with 14c0228a597265a98b1fd9da7dcc1b85b50bbd1500a909a8ac9765d112d3e268 not found: ID does not exist" containerID="14c0228a597265a98b1fd9da7dcc1b85b50bbd1500a909a8ac9765d112d3e268" Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.170020 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c0228a597265a98b1fd9da7dcc1b85b50bbd1500a909a8ac9765d112d3e268"} err="failed to get container status \"14c0228a597265a98b1fd9da7dcc1b85b50bbd1500a909a8ac9765d112d3e268\": rpc error: code = NotFound desc = could not find container \"14c0228a597265a98b1fd9da7dcc1b85b50bbd1500a909a8ac9765d112d3e268\": container with ID starting with 14c0228a597265a98b1fd9da7dcc1b85b50bbd1500a909a8ac9765d112d3e268 not found: 
ID does not exist" Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.170051 5008 scope.go:117] "RemoveContainer" containerID="0b6d6e4aecb0d75be19bcfc0482eeee5076abe136bdfe89abd4abbb83eaf81fe" Mar 18 19:36:23 crc kubenswrapper[5008]: E0318 19:36:23.170534 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b6d6e4aecb0d75be19bcfc0482eeee5076abe136bdfe89abd4abbb83eaf81fe\": container with ID starting with 0b6d6e4aecb0d75be19bcfc0482eeee5076abe136bdfe89abd4abbb83eaf81fe not found: ID does not exist" containerID="0b6d6e4aecb0d75be19bcfc0482eeee5076abe136bdfe89abd4abbb83eaf81fe" Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.170585 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6d6e4aecb0d75be19bcfc0482eeee5076abe136bdfe89abd4abbb83eaf81fe"} err="failed to get container status \"0b6d6e4aecb0d75be19bcfc0482eeee5076abe136bdfe89abd4abbb83eaf81fe\": rpc error: code = NotFound desc = could not find container \"0b6d6e4aecb0d75be19bcfc0482eeee5076abe136bdfe89abd4abbb83eaf81fe\": container with ID starting with 0b6d6e4aecb0d75be19bcfc0482eeee5076abe136bdfe89abd4abbb83eaf81fe not found: ID does not exist" Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.170609 5008 scope.go:117] "RemoveContainer" containerID="c50bbfd3988d843eb82bca29d73f721acb3272c193d22f68ace2faea7039d522" Mar 18 19:36:23 crc kubenswrapper[5008]: E0318 19:36:23.170908 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c50bbfd3988d843eb82bca29d73f721acb3272c193d22f68ace2faea7039d522\": container with ID starting with c50bbfd3988d843eb82bca29d73f721acb3272c193d22f68ace2faea7039d522 not found: ID does not exist" containerID="c50bbfd3988d843eb82bca29d73f721acb3272c193d22f68ace2faea7039d522" Mar 18 19:36:23 crc kubenswrapper[5008]: I0318 19:36:23.170940 5008 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50bbfd3988d843eb82bca29d73f721acb3272c193d22f68ace2faea7039d522"} err="failed to get container status \"c50bbfd3988d843eb82bca29d73f721acb3272c193d22f68ace2faea7039d522\": rpc error: code = NotFound desc = could not find container \"c50bbfd3988d843eb82bca29d73f721acb3272c193d22f68ace2faea7039d522\": container with ID starting with c50bbfd3988d843eb82bca29d73f721acb3272c193d22f68ace2faea7039d522 not found: ID does not exist" Mar 18 19:36:24 crc kubenswrapper[5008]: I0318 19:36:24.208504 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb395746-ce79-4109-9116-1872b9e98e1a" path="/var/lib/kubelet/pods/fb395746-ce79-4109-9116-1872b9e98e1a/volumes" Mar 18 19:36:24 crc kubenswrapper[5008]: I0318 19:36:24.459713 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 19:36:24 crc kubenswrapper[5008]: I0318 19:36:24.459775 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:36:45 crc kubenswrapper[5008]: I0318 19:36:45.491780 5008 scope.go:117] "RemoveContainer" containerID="10cc64796f9e634bcb7ab15134249efbcf1ebcdaa30ec7bc541caa2258fa378c" Mar 18 19:36:54 crc kubenswrapper[5008]: I0318 19:36:54.459749 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 19:36:54 crc kubenswrapper[5008]: I0318 19:36:54.460405 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.169374 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zrj6b"] Mar 18 19:36:55 crc kubenswrapper[5008]: E0318 19:36:55.169827 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb395746-ce79-4109-9116-1872b9e98e1a" containerName="registry-server" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.169851 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb395746-ce79-4109-9116-1872b9e98e1a" containerName="registry-server" Mar 18 19:36:55 crc kubenswrapper[5008]: E0318 19:36:55.169873 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb395746-ce79-4109-9116-1872b9e98e1a" containerName="extract-content" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.169881 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb395746-ce79-4109-9116-1872b9e98e1a" containerName="extract-content" Mar 18 19:36:55 crc kubenswrapper[5008]: E0318 19:36:55.169894 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb395746-ce79-4109-9116-1872b9e98e1a" containerName="extract-utilities" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.169903 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb395746-ce79-4109-9116-1872b9e98e1a" containerName="extract-utilities" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.170125 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb395746-ce79-4109-9116-1872b9e98e1a" containerName="registry-server" Mar 18 
19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.171578 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.178680 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrj6b"] Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.364364 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95d3986f-8584-4dbf-82cc-a3d6160a6769-catalog-content\") pod \"redhat-operators-zrj6b\" (UID: \"95d3986f-8584-4dbf-82cc-a3d6160a6769\") " pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.364406 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95d3986f-8584-4dbf-82cc-a3d6160a6769-utilities\") pod \"redhat-operators-zrj6b\" (UID: \"95d3986f-8584-4dbf-82cc-a3d6160a6769\") " pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.364444 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgz6m\" (UniqueName: \"kubernetes.io/projected/95d3986f-8584-4dbf-82cc-a3d6160a6769-kube-api-access-dgz6m\") pod \"redhat-operators-zrj6b\" (UID: \"95d3986f-8584-4dbf-82cc-a3d6160a6769\") " pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.466135 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95d3986f-8584-4dbf-82cc-a3d6160a6769-catalog-content\") pod \"redhat-operators-zrj6b\" (UID: \"95d3986f-8584-4dbf-82cc-a3d6160a6769\") " pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:36:55 crc 
kubenswrapper[5008]: I0318 19:36:55.466585 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95d3986f-8584-4dbf-82cc-a3d6160a6769-utilities\") pod \"redhat-operators-zrj6b\" (UID: \"95d3986f-8584-4dbf-82cc-a3d6160a6769\") " pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.466649 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgz6m\" (UniqueName: \"kubernetes.io/projected/95d3986f-8584-4dbf-82cc-a3d6160a6769-kube-api-access-dgz6m\") pod \"redhat-operators-zrj6b\" (UID: \"95d3986f-8584-4dbf-82cc-a3d6160a6769\") " pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.466772 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95d3986f-8584-4dbf-82cc-a3d6160a6769-catalog-content\") pod \"redhat-operators-zrj6b\" (UID: \"95d3986f-8584-4dbf-82cc-a3d6160a6769\") " pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.467017 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95d3986f-8584-4dbf-82cc-a3d6160a6769-utilities\") pod \"redhat-operators-zrj6b\" (UID: \"95d3986f-8584-4dbf-82cc-a3d6160a6769\") " pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.484704 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgz6m\" (UniqueName: \"kubernetes.io/projected/95d3986f-8584-4dbf-82cc-a3d6160a6769-kube-api-access-dgz6m\") pod \"redhat-operators-zrj6b\" (UID: \"95d3986f-8584-4dbf-82cc-a3d6160a6769\") " pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.512940 5008 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:36:55 crc kubenswrapper[5008]: I0318 19:36:55.976190 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrj6b"] Mar 18 19:36:55 crc kubenswrapper[5008]: W0318 19:36:55.984741 5008 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95d3986f_8584_4dbf_82cc_a3d6160a6769.slice/crio-f61b7235c80a394e0fcaffe3a70869f8846449b04479501c17ff120dd8f9d584 WatchSource:0}: Error finding container f61b7235c80a394e0fcaffe3a70869f8846449b04479501c17ff120dd8f9d584: Status 404 returned error can't find the container with id f61b7235c80a394e0fcaffe3a70869f8846449b04479501c17ff120dd8f9d584 Mar 18 19:36:56 crc kubenswrapper[5008]: I0318 19:36:56.433931 5008 generic.go:334] "Generic (PLEG): container finished" podID="95d3986f-8584-4dbf-82cc-a3d6160a6769" containerID="31027980ed4ea2e77cbfca60f0fc1c6c6fa352c4b71efaa34467e65cfd5731c0" exitCode=0 Mar 18 19:36:56 crc kubenswrapper[5008]: I0318 19:36:56.433973 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj6b" event={"ID":"95d3986f-8584-4dbf-82cc-a3d6160a6769","Type":"ContainerDied","Data":"31027980ed4ea2e77cbfca60f0fc1c6c6fa352c4b71efaa34467e65cfd5731c0"} Mar 18 19:36:56 crc kubenswrapper[5008]: I0318 19:36:56.433997 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj6b" event={"ID":"95d3986f-8584-4dbf-82cc-a3d6160a6769","Type":"ContainerStarted","Data":"f61b7235c80a394e0fcaffe3a70869f8846449b04479501c17ff120dd8f9d584"} Mar 18 19:36:58 crc kubenswrapper[5008]: I0318 19:36:58.461547 5008 generic.go:334] "Generic (PLEG): container finished" podID="95d3986f-8584-4dbf-82cc-a3d6160a6769" containerID="1537ccb5a7590ef58f514f6521319af0ef677464ecfa88015f0cbaa025ecf719" exitCode=0 Mar 18 19:36:58 crc 
kubenswrapper[5008]: I0318 19:36:58.461597 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj6b" event={"ID":"95d3986f-8584-4dbf-82cc-a3d6160a6769","Type":"ContainerDied","Data":"1537ccb5a7590ef58f514f6521319af0ef677464ecfa88015f0cbaa025ecf719"} Mar 18 19:36:59 crc kubenswrapper[5008]: I0318 19:36:59.473769 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj6b" event={"ID":"95d3986f-8584-4dbf-82cc-a3d6160a6769","Type":"ContainerStarted","Data":"5469fdb8a50115cdc60df76097db2023734b9a4e30b3c7e6d026d5c712de54f0"} Mar 18 19:36:59 crc kubenswrapper[5008]: I0318 19:36:59.492818 5008 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zrj6b" podStartSLOduration=1.779217623 podStartE2EDuration="4.492797291s" podCreationTimestamp="2026-03-18 19:36:55 +0000 UTC" firstStartedPulling="2026-03-18 19:36:56.435579731 +0000 UTC m=+5672.955052810" lastFinishedPulling="2026-03-18 19:36:59.149159399 +0000 UTC m=+5675.668632478" observedRunningTime="2026-03-18 19:36:59.492676938 +0000 UTC m=+5676.012150037" watchObservedRunningTime="2026-03-18 19:36:59.492797291 +0000 UTC m=+5676.012270370" Mar 18 19:37:05 crc kubenswrapper[5008]: I0318 19:37:05.513517 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:37:05 crc kubenswrapper[5008]: I0318 19:37:05.514053 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:37:06 crc kubenswrapper[5008]: I0318 19:37:06.564378 5008 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zrj6b" podUID="95d3986f-8584-4dbf-82cc-a3d6160a6769" containerName="registry-server" probeResult="failure" output=< Mar 18 19:37:06 crc kubenswrapper[5008]: timeout: failed to connect service ":50051" within 1s 
Mar 18 19:37:06 crc kubenswrapper[5008]: > Mar 18 19:37:15 crc kubenswrapper[5008]: I0318 19:37:15.566362 5008 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:37:15 crc kubenswrapper[5008]: I0318 19:37:15.616203 5008 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zrj6b" Mar 18 19:37:15 crc kubenswrapper[5008]: I0318 19:37:15.805598 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zrj6b"] Mar 18 19:37:16 crc kubenswrapper[5008]: I0318 19:37:16.629382 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zrj6b" podUID="95d3986f-8584-4dbf-82cc-a3d6160a6769" containerName="registry-server" containerID="cri-o://5469fdb8a50115cdc60df76097db2023734b9a4e30b3c7e6d026d5c712de54f0" gracePeriod=2 Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.189824 5008 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zrj6b"
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.295651 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95d3986f-8584-4dbf-82cc-a3d6160a6769-utilities\") pod \"95d3986f-8584-4dbf-82cc-a3d6160a6769\" (UID: \"95d3986f-8584-4dbf-82cc-a3d6160a6769\") "
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.295808 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgz6m\" (UniqueName: \"kubernetes.io/projected/95d3986f-8584-4dbf-82cc-a3d6160a6769-kube-api-access-dgz6m\") pod \"95d3986f-8584-4dbf-82cc-a3d6160a6769\" (UID: \"95d3986f-8584-4dbf-82cc-a3d6160a6769\") "
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.295971 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95d3986f-8584-4dbf-82cc-a3d6160a6769-catalog-content\") pod \"95d3986f-8584-4dbf-82cc-a3d6160a6769\" (UID: \"95d3986f-8584-4dbf-82cc-a3d6160a6769\") "
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.297707 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95d3986f-8584-4dbf-82cc-a3d6160a6769-utilities" (OuterVolumeSpecName: "utilities") pod "95d3986f-8584-4dbf-82cc-a3d6160a6769" (UID: "95d3986f-8584-4dbf-82cc-a3d6160a6769"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.309149 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d3986f-8584-4dbf-82cc-a3d6160a6769-kube-api-access-dgz6m" (OuterVolumeSpecName: "kube-api-access-dgz6m") pod "95d3986f-8584-4dbf-82cc-a3d6160a6769" (UID: "95d3986f-8584-4dbf-82cc-a3d6160a6769"). InnerVolumeSpecName "kube-api-access-dgz6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.398496 5008 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95d3986f-8584-4dbf-82cc-a3d6160a6769-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.398538 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgz6m\" (UniqueName: \"kubernetes.io/projected/95d3986f-8584-4dbf-82cc-a3d6160a6769-kube-api-access-dgz6m\") on node \"crc\" DevicePath \"\""
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.450755 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95d3986f-8584-4dbf-82cc-a3d6160a6769-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95d3986f-8584-4dbf-82cc-a3d6160a6769" (UID: "95d3986f-8584-4dbf-82cc-a3d6160a6769"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.499694 5008 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95d3986f-8584-4dbf-82cc-a3d6160a6769-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.647224 5008 generic.go:334] "Generic (PLEG): container finished" podID="95d3986f-8584-4dbf-82cc-a3d6160a6769" containerID="5469fdb8a50115cdc60df76097db2023734b9a4e30b3c7e6d026d5c712de54f0" exitCode=0
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.647481 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj6b" event={"ID":"95d3986f-8584-4dbf-82cc-a3d6160a6769","Type":"ContainerDied","Data":"5469fdb8a50115cdc60df76097db2023734b9a4e30b3c7e6d026d5c712de54f0"}
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.647855 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrj6b" event={"ID":"95d3986f-8584-4dbf-82cc-a3d6160a6769","Type":"ContainerDied","Data":"f61b7235c80a394e0fcaffe3a70869f8846449b04479501c17ff120dd8f9d584"}
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.647905 5008 scope.go:117] "RemoveContainer" containerID="5469fdb8a50115cdc60df76097db2023734b9a4e30b3c7e6d026d5c712de54f0"
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.647689 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrj6b"
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.680858 5008 scope.go:117] "RemoveContainer" containerID="1537ccb5a7590ef58f514f6521319af0ef677464ecfa88015f0cbaa025ecf719"
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.707231 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zrj6b"]
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.722723 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zrj6b"]
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.731949 5008 scope.go:117] "RemoveContainer" containerID="31027980ed4ea2e77cbfca60f0fc1c6c6fa352c4b71efaa34467e65cfd5731c0"
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.760315 5008 scope.go:117] "RemoveContainer" containerID="5469fdb8a50115cdc60df76097db2023734b9a4e30b3c7e6d026d5c712de54f0"
Mar 18 19:37:17 crc kubenswrapper[5008]: E0318 19:37:17.761381 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5469fdb8a50115cdc60df76097db2023734b9a4e30b3c7e6d026d5c712de54f0\": container with ID starting with 5469fdb8a50115cdc60df76097db2023734b9a4e30b3c7e6d026d5c712de54f0 not found: ID does not exist" containerID="5469fdb8a50115cdc60df76097db2023734b9a4e30b3c7e6d026d5c712de54f0"
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.761551 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5469fdb8a50115cdc60df76097db2023734b9a4e30b3c7e6d026d5c712de54f0"} err="failed to get container status \"5469fdb8a50115cdc60df76097db2023734b9a4e30b3c7e6d026d5c712de54f0\": rpc error: code = NotFound desc = could not find container \"5469fdb8a50115cdc60df76097db2023734b9a4e30b3c7e6d026d5c712de54f0\": container with ID starting with 5469fdb8a50115cdc60df76097db2023734b9a4e30b3c7e6d026d5c712de54f0 not found: ID does not exist"
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.761757 5008 scope.go:117] "RemoveContainer" containerID="1537ccb5a7590ef58f514f6521319af0ef677464ecfa88015f0cbaa025ecf719"
Mar 18 19:37:17 crc kubenswrapper[5008]: E0318 19:37:17.762460 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1537ccb5a7590ef58f514f6521319af0ef677464ecfa88015f0cbaa025ecf719\": container with ID starting with 1537ccb5a7590ef58f514f6521319af0ef677464ecfa88015f0cbaa025ecf719 not found: ID does not exist" containerID="1537ccb5a7590ef58f514f6521319af0ef677464ecfa88015f0cbaa025ecf719"
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.762728 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1537ccb5a7590ef58f514f6521319af0ef677464ecfa88015f0cbaa025ecf719"} err="failed to get container status \"1537ccb5a7590ef58f514f6521319af0ef677464ecfa88015f0cbaa025ecf719\": rpc error: code = NotFound desc = could not find container \"1537ccb5a7590ef58f514f6521319af0ef677464ecfa88015f0cbaa025ecf719\": container with ID starting with 1537ccb5a7590ef58f514f6521319af0ef677464ecfa88015f0cbaa025ecf719 not found: ID does not exist"
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.762904 5008 scope.go:117] "RemoveContainer" containerID="31027980ed4ea2e77cbfca60f0fc1c6c6fa352c4b71efaa34467e65cfd5731c0"
Mar 18 19:37:17 crc kubenswrapper[5008]: E0318 19:37:17.763724 5008 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31027980ed4ea2e77cbfca60f0fc1c6c6fa352c4b71efaa34467e65cfd5731c0\": container with ID starting with 31027980ed4ea2e77cbfca60f0fc1c6c6fa352c4b71efaa34467e65cfd5731c0 not found: ID does not exist" containerID="31027980ed4ea2e77cbfca60f0fc1c6c6fa352c4b71efaa34467e65cfd5731c0"
Mar 18 19:37:17 crc kubenswrapper[5008]: I0318 19:37:17.763796 5008 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31027980ed4ea2e77cbfca60f0fc1c6c6fa352c4b71efaa34467e65cfd5731c0"} err="failed to get container status \"31027980ed4ea2e77cbfca60f0fc1c6c6fa352c4b71efaa34467e65cfd5731c0\": rpc error: code = NotFound desc = could not find container \"31027980ed4ea2e77cbfca60f0fc1c6c6fa352c4b71efaa34467e65cfd5731c0\": container with ID starting with 31027980ed4ea2e77cbfca60f0fc1c6c6fa352c4b71efaa34467e65cfd5731c0 not found: ID does not exist"
Mar 18 19:37:18 crc kubenswrapper[5008]: I0318 19:37:18.215746 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d3986f-8584-4dbf-82cc-a3d6160a6769" path="/var/lib/kubelet/pods/95d3986f-8584-4dbf-82cc-a3d6160a6769/volumes"
Mar 18 19:37:24 crc kubenswrapper[5008]: I0318 19:37:24.460465 5008 patch_prober.go:28] interesting pod/machine-config-daemon-crzrt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 19:37:24 crc kubenswrapper[5008]: I0318 19:37:24.461262 5008 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 19:37:24 crc kubenswrapper[5008]: I0318 19:37:24.461328 5008 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crzrt"
Mar 18 19:37:24 crc kubenswrapper[5008]: I0318 19:37:24.462167 5008 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb68cfa01f563fe658c037b538850cc91db5ca75954107e62dbb9316352fbb53"} pod="openshift-machine-config-operator/machine-config-daemon-crzrt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 19:37:24 crc kubenswrapper[5008]: I0318 19:37:24.462294 5008 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerName="machine-config-daemon" containerID="cri-o://bb68cfa01f563fe658c037b538850cc91db5ca75954107e62dbb9316352fbb53" gracePeriod=600
Mar 18 19:37:24 crc kubenswrapper[5008]: E0318 19:37:24.607128 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9"
Mar 18 19:37:24 crc kubenswrapper[5008]: I0318 19:37:24.737280 5008 generic.go:334] "Generic (PLEG): container finished" podID="de73a23f-7b17-40f3-bb5d-14c8bff178b9" containerID="bb68cfa01f563fe658c037b538850cc91db5ca75954107e62dbb9316352fbb53" exitCode=0
Mar 18 19:37:24 crc kubenswrapper[5008]: I0318 19:37:24.737409 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" event={"ID":"de73a23f-7b17-40f3-bb5d-14c8bff178b9","Type":"ContainerDied","Data":"bb68cfa01f563fe658c037b538850cc91db5ca75954107e62dbb9316352fbb53"}
Mar 18 19:37:24 crc kubenswrapper[5008]: I0318 19:37:24.737734 5008 scope.go:117] "RemoveContainer" containerID="4ed354de23509ced81330dce6064f0e54cdb4d741d3f848d673994d0c464ccba"
Mar 18 19:37:24 crc kubenswrapper[5008]: I0318 19:37:24.739787 5008 scope.go:117] "RemoveContainer" containerID="bb68cfa01f563fe658c037b538850cc91db5ca75954107e62dbb9316352fbb53"
Mar 18 19:37:24 crc kubenswrapper[5008]: E0318 19:37:24.740993 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9"
Mar 18 19:37:39 crc kubenswrapper[5008]: I0318 19:37:39.199061 5008 scope.go:117] "RemoveContainer" containerID="bb68cfa01f563fe658c037b538850cc91db5ca75954107e62dbb9316352fbb53"
Mar 18 19:37:39 crc kubenswrapper[5008]: E0318 19:37:39.200067 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9"
Mar 18 19:37:45 crc kubenswrapper[5008]: I0318 19:37:45.644040 5008 scope.go:117] "RemoveContainer" containerID="b9f1a321ec3bfc10d5c6a2d5aaa5c93e4aee395ee3cf5c0d21d0b94b9099284b"
Mar 18 19:37:54 crc kubenswrapper[5008]: I0318 19:37:54.202354 5008 scope.go:117] "RemoveContainer" containerID="bb68cfa01f563fe658c037b538850cc91db5ca75954107e62dbb9316352fbb53"
Mar 18 19:37:54 crc kubenswrapper[5008]: E0318 19:37:54.203085 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.151146 5008 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564378-2pn7t"]
Mar 18 19:38:00 crc kubenswrapper[5008]: E0318 19:38:00.154920 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d3986f-8584-4dbf-82cc-a3d6160a6769" containerName="registry-server"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.154941 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d3986f-8584-4dbf-82cc-a3d6160a6769" containerName="registry-server"
Mar 18 19:38:00 crc kubenswrapper[5008]: E0318 19:38:00.154960 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d3986f-8584-4dbf-82cc-a3d6160a6769" containerName="extract-content"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.154966 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d3986f-8584-4dbf-82cc-a3d6160a6769" containerName="extract-content"
Mar 18 19:38:00 crc kubenswrapper[5008]: E0318 19:38:00.155089 5008 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d3986f-8584-4dbf-82cc-a3d6160a6769" containerName="extract-utilities"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.155098 5008 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d3986f-8584-4dbf-82cc-a3d6160a6769" containerName="extract-utilities"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.155250 5008 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d3986f-8584-4dbf-82cc-a3d6160a6769" containerName="registry-server"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.161002 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564378-2pn7t"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.163733 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564378-2pn7t"]
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.164693 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.169232 5008 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.169628 5008 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8dgsj"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.268515 5008 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s629g\" (UniqueName: \"kubernetes.io/projected/54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf-kube-api-access-s629g\") pod \"auto-csr-approver-29564378-2pn7t\" (UID: \"54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf\") " pod="openshift-infra/auto-csr-approver-29564378-2pn7t"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.371012 5008 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s629g\" (UniqueName: \"kubernetes.io/projected/54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf-kube-api-access-s629g\") pod \"auto-csr-approver-29564378-2pn7t\" (UID: \"54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf\") " pod="openshift-infra/auto-csr-approver-29564378-2pn7t"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.401977 5008 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s629g\" (UniqueName: \"kubernetes.io/projected/54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf-kube-api-access-s629g\") pod \"auto-csr-approver-29564378-2pn7t\" (UID: \"54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf\") " pod="openshift-infra/auto-csr-approver-29564378-2pn7t"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.493815 5008 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564378-2pn7t"
Mar 18 19:38:00 crc kubenswrapper[5008]: I0318 19:38:00.948381 5008 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564378-2pn7t"]
Mar 18 19:38:01 crc kubenswrapper[5008]: I0318 19:38:01.103057 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564378-2pn7t" event={"ID":"54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf","Type":"ContainerStarted","Data":"b52da8b1870a02ddf01b2cfeed6d54df0e5a1a1b7a2c815be5ea41a0840ebab7"}
Mar 18 19:38:04 crc kubenswrapper[5008]: I0318 19:38:04.128729 5008 generic.go:334] "Generic (PLEG): container finished" podID="54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf" containerID="53d4d54b6e2951fae8d82cffd76b444aa36a9f9e6b95f645176cc6473fb251df" exitCode=0
Mar 18 19:38:04 crc kubenswrapper[5008]: I0318 19:38:04.128779 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564378-2pn7t" event={"ID":"54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf","Type":"ContainerDied","Data":"53d4d54b6e2951fae8d82cffd76b444aa36a9f9e6b95f645176cc6473fb251df"}
Mar 18 19:38:05 crc kubenswrapper[5008]: I0318 19:38:05.454999 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564378-2pn7t"
Mar 18 19:38:05 crc kubenswrapper[5008]: I0318 19:38:05.576041 5008 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s629g\" (UniqueName: \"kubernetes.io/projected/54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf-kube-api-access-s629g\") pod \"54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf\" (UID: \"54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf\") "
Mar 18 19:38:05 crc kubenswrapper[5008]: I0318 19:38:05.584321 5008 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf-kube-api-access-s629g" (OuterVolumeSpecName: "kube-api-access-s629g") pod "54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf" (UID: "54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf"). InnerVolumeSpecName "kube-api-access-s629g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 19:38:05 crc kubenswrapper[5008]: I0318 19:38:05.677803 5008 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s629g\" (UniqueName: \"kubernetes.io/projected/54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf-kube-api-access-s629g\") on node \"crc\" DevicePath \"\""
Mar 18 19:38:06 crc kubenswrapper[5008]: I0318 19:38:06.148630 5008 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564378-2pn7t" event={"ID":"54d100e4-e1ea-4c3c-9ba4-278f3d9fbebf","Type":"ContainerDied","Data":"b52da8b1870a02ddf01b2cfeed6d54df0e5a1a1b7a2c815be5ea41a0840ebab7"}
Mar 18 19:38:06 crc kubenswrapper[5008]: I0318 19:38:06.148672 5008 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b52da8b1870a02ddf01b2cfeed6d54df0e5a1a1b7a2c815be5ea41a0840ebab7"
Mar 18 19:38:06 crc kubenswrapper[5008]: I0318 19:38:06.148674 5008 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564378-2pn7t"
Mar 18 19:38:06 crc kubenswrapper[5008]: I0318 19:38:06.563640 5008 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564372-p5twb"]
Mar 18 19:38:06 crc kubenswrapper[5008]: I0318 19:38:06.576289 5008 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564372-p5twb"]
Mar 18 19:38:08 crc kubenswrapper[5008]: I0318 19:38:08.214113 5008 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b" path="/var/lib/kubelet/pods/d7edaa8c-5eb4-4df7-b896-5e9cfbabe91b/volumes"
Mar 18 19:38:09 crc kubenswrapper[5008]: I0318 19:38:09.198522 5008 scope.go:117] "RemoveContainer" containerID="bb68cfa01f563fe658c037b538850cc91db5ca75954107e62dbb9316352fbb53"
Mar 18 19:38:09 crc kubenswrapper[5008]: E0318 19:38:09.199027 5008 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crzrt_openshift-machine-config-operator(de73a23f-7b17-40f3-bb5d-14c8bff178b9)\"" pod="openshift-machine-config-operator/machine-config-daemon-crzrt" podUID="de73a23f-7b17-40f3-bb5d-14c8bff178b9"